Search results for: information extraction evaluation method

31315 A Proposal of Advanced Key Performance Indicators for Assessing Six Performances of Construction Projects

Authors: Wi Sung Yoo, Seung Woo Lee, Youn Kyoung Hur, Sung Hwan Kim

Abstract:

Large-scale construction projects are continuously increasing, and the need for tools to monitor and evaluate project success is emphasized. At the construction industry level, there are limitations in deriving performance evaluation factors that reflect the diversity of construction sites, and systems that can objectively evaluate and manage performance are lacking. Additionally, there are difficulties in integrating the structured and unstructured data generated at construction sites and deriving improvements. In this study, we propose Key Performance Indicators (KPIs) that enable performance evaluation reflecting the increased diversity of construction sites and the unstructured data they generate, and we present a model for measuring performance with the derived indicators. The comprehensive performance of a unit construction site is assessed across 6 areas (Time, Cost, Quality, Safety, Environment, Productivity) and 26 indicators. We collected performance indicator information from 30 construction sites that meet legal standards and were completed successfully, and we applied data augmentation and optimization techniques to establish measurement standards for each indicator. In other words, the KPIs for construction site performance evaluation presented in this study provide standards for evaluating performance in six areas using institutional requirement data and document data. This can be expanded into a performance evaluation system that considers the scale and type of construction project. The indicators are also expected to serve as a comprehensive index for the construction industry and as basic data for tracking competitiveness at the national level and establishing policies.
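
As a rough illustration of how area-level scores might roll up into one site-level figure, the minimal Python sketch below aggregates the six performance areas with a weighted sum; the weights, scores, and 0-100 scale are hypothetical, since the paper's actual 26-indicator measurement standards are not reproduced here.

```python
# Hypothetical area weights and 0-100 area scores; the paper's actual 26
# indicators and measurement standards are not reproduced here.
area_weights = {"time": 0.20, "cost": 0.20, "quality": 0.20,
                "safety": 0.15, "environment": 0.10, "productivity": 0.15}

def composite_kpi(area_scores, weights):
    """Weighted sum of per-area scores into one site-level score."""
    assert abs(sum(weights.values()) - 1.0) < 1e-9, "weights must sum to 1"
    return sum(weights[a] * area_scores[a] for a in weights)

site = {"time": 82, "cost": 75, "quality": 90,
        "safety": 88, "environment": 70, "productivity": 79}
print(f"composite KPI: {composite_kpi(site, area_weights):.1f}")
```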

Keywords: key performance indicator, performance measurement, structured and unstructured data, data augmentation

Procedia PDF Downloads 42
31314 Strabismus Detection Using Eye Alignment Stability

Authors: Anoop T. R., Otman Basir, Robert F. Hess, Ben Thompson

Abstract:

Strabismus refers to a misalignment of the eyes. Early detection and treatment of strabismus in childhood can prevent the development of permanent vision loss due to abnormal development of visual brain areas. Many children with strabismus remain undiagnosed until school entry because current automated screening methods have limited success in the preschool age range. A method for strabismus detection using eye alignment stability (EAS) is proposed. This method starts with face detection, followed by facial landmark detection, eye region segmentation, eye gaze extraction, and eye alignment stability estimation. Binarization and morphological operations are performed to segment the pupil region from the eye. After the EAS is computed, its absolute value is used to differentiate the strabismic eye from the non-strabismic eye: if the eye alignment stability is greater than a particular threshold, the eyes are misaligned; if it is less than the threshold, the eyes are aligned. The method was tested on 175 strabismic and non-strabismic images obtained from Kaggle and Google Photos. The strabismic eye is taken as the positive class and the non-strabismic eye as the negative class. The test produced a true positive rate of 100% and a false positive rate of 7.69%.
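
A minimal sketch of the pupil-segmentation and decision steps is given below, using OpenCV. The EAS computation itself depends on gaze features the abstract does not fully specify, so `is_strabismic` simply applies the stated threshold rule to an EAS value assumed to be computed elsewhere.

```python
import cv2
import numpy as np

def pupil_center(eye_gray):
    """Binarize the eye region (dark pupil), clean it up with morphology,
    and return the pupil centroid from image moments."""
    _, binary = cv2.threshold(eye_gray, 0, 255,
                              cv2.THRESH_BINARY_INV + cv2.THRESH_OTSU)
    binary = cv2.morphologyEx(binary, cv2.MORPH_OPEN, np.ones((3, 3), np.uint8))
    m = cv2.moments(binary)
    if m["m00"] == 0:
        raise ValueError("no pupil pixels found")
    return m["m10"] / m["m00"], m["m01"] / m["m00"]

def is_strabismic(eas_value, threshold):
    """Decision rule stated in the abstract: |EAS| above the threshold
    means the eyes are misaligned."""
    return abs(eas_value) > threshold
```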

Keywords: strabismus, face detection, facial landmarks, eye segmentation, eye gaze, binarization

Procedia PDF Downloads 76
31313 Evaluation Method for Fouling Risk Using Quartz Crystal Microbalance

Authors: Natsuki Kishizawa, Keiko Nakano, Hussam Organji, Amer Shaiban, Mohammad Albeirutty

Abstract:

One of the most important tasks in operating desalination plants using the reverse osmosis (RO) method is preventing RO membrane fouling caused by foulants found in seawater. Optimal design of the pre-treatment process enables the reduction of foulants. Therefore, a quantitative evaluation of the fouling risk in pre-treated water, which is fed to RO, is required for optimal design. Some water quality measurement methods, such as the silt density index (SDI) and total organic carbon (TOC), have been conservatively applied for such evaluations. However, these methods have not been effective in some situations for evaluating the fouling risk of RO feed water. Furthermore, if the method can be applied to inline monitoring of the fouling risk of RO feed water, stable plant management will be possible through alerts and appropriate control of the pre-treatment process. The purpose of this study is to develop a method to evaluate the fouling risk of RO feed water. We applied a quartz crystal microbalance (QCM) to measure the amount of foulants found in seawater, using a sensor whose surface is coated with a polyamide thin film, the main material of an RO membrane. The increase in the weight of the sensor after sample water has passed over it for a set length of time directly indicates the fouling risk of the sample. We refer to these values as the fouling potential (FP). The method measures very small amounts of substances in seawater in a short time (< 2 h) and from a small volume of sample water (< 50 mL). In laboratory-scale tests using RO cell filtration units, FP correlated more strongly with the pressure increase caused by RO fouling than SDI or TOC did. Then, to establish the correlation for an actual bench-scale RO membrane module, and to confirm the feasibility of the monitoring system as a control tool for the pre-treatment process, we started a long-term test at an experimental desalination site by the Red Sea in Jeddah, Kingdom of Saudi Arabia. Implementing inline equipment for the method made it possible to measure FP intermittently (4 times per day) and automatically. Moreover, over two 3-month operations, the RO operating pressure was compared among feed water samples of different qualities. A pressure increase through the RO membrane module was observed in the high-FP RO unit, whose feed water was treated by a cartridge filter only. In contrast, no pressure increase was observed in the low-FP RO unit, whose feed water was treated by an ultrafilter during the operation. The correlation for an actual-scale RO membrane was thus established in two runs with two types of feed water, suggesting that the FP method enables evaluation of the fouling risk of RO feed water.
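
The abstract reports FP as a weight gain measured by the QCM sensor. A standard way to convert a QCM frequency shift into deposited mass is the Sauerbrey relation; the sketch below uses it to form an illustrative mass-accumulation-rate index. The paper's exact FP definition and units are not given, so this formulation is an assumption.

```python
# Sauerbrey relation for a 5 MHz AT-cut quartz crystal:
#   delta_m = -C * delta_f,  C ≈ 17.7 ng/(cm^2 Hz)
C_SAUERBREY = 17.7  # ng cm^-2 Hz^-1

def deposited_mass(delta_f_hz):
    """Mass gained per unit area (ng/cm^2) from a frequency drop."""
    return -C_SAUERBREY * delta_f_hz

def fouling_potential(f_start_hz, f_end_hz, hours):
    """Illustrative FP index: mass accumulation rate (ng/cm^2/h)."""
    return deposited_mass(f_end_hz - f_start_hz) / hours

# A 60 Hz drop over a 2 h exposure of the polyamide-coated sensor:
print(fouling_potential(5_000_000.0, 4_999_940.0, 2.0), "ng/cm^2/h")
```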

Keywords: fouling, monitoring, QCM, water quality

Procedia PDF Downloads 212
31312 Online Topic Model for Broadcasting Contents Using Semantic Correlation Information

Authors: Chang-Uk Kwak, Sun-Joong Kim, Seong-Bae Park, Sang-Jo Lee

Abstract:

This paper proposes a method of learning topics for broadcasting contents. There are two kinds of text related to broadcasting contents. One is the broadcasting script, a series of texts including directions and dialogues. The other is blogposts, which contain relatively abstract content, stories, and diverse information about the broadcast. Although both kinds of text cover similar broadcasting content, the words used in blogposts and broadcasting scripts differ. Improving the quality of the topics therefore requires a method that accounts for this vocabulary difference. In this paper, we introduce a semantic vocabulary expansion method to solve the word difference. We expand the topics of the broadcasting script by incorporating the words in blogposts: each word in a blogpost is added to the most semantically correlated topics. We use word2vec to obtain the semantic correlation between words in blogposts and topics of scripts. The topic vocabularies are updated, and posterior inference is then performed to rearrange the topics. In experiments, we verified that the proposed method can learn more salient topics for broadcasting contents.
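
A minimal sketch of the expansion step, using gensim's word2vec: each blogpost word is assigned to the topic whose top script words are, on average, most similar to it. The toy corpus, topic sets, and averaging rule are illustrative assumptions.

```python
import numpy as np
from gensim.models import Word2Vec

# Toy stand-ins: script sentences, blogpost words, and two script topics.
script_sentences = [["score", "game", "lead"], ["quarter", "rebound", "game"],
                    ["budget", "market", "trade"]]
blog_words = ["playoff", "coach"]
topics = {0: ["score", "game"], 1: ["budget", "market"]}

# Train word2vec on the combined text so every word has a vector.
model = Word2Vec(script_sentences + [blog_words], vector_size=50,
                 min_count=1, seed=1, workers=1)

def best_topic(word):
    """Pick the topic whose top words are most similar to `word` on average."""
    sims = {t: np.mean([model.wv.similarity(word, w) for w in ws])
            for t, ws in topics.items()}
    return max(sims, key=sims.get)

for w in blog_words:            # expand each topic's vocabulary
    topics[best_topic(w)].append(w)
print(topics)
```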

Keywords: broadcasting script analysis, topic expansion, semantic correlation analysis, word2vec

Procedia PDF Downloads 251
31311 Eliciting and Confirming Data, Information, Knowledge and Wisdom in a Specialist Health Care Setting: The WICKED Method

Authors: Sinead Impey, Damon Berry, Selma Furtado, Miriam Galvin, Loretto Grogan, Orla Hardiman, Lucy Hederman, Mark Heverin, Vincent Wade, Linda Douris, Declan O'Sullivan, Gaye Stephens

Abstract:

Healthcare is a knowledge-rich environment. This knowledge, while valuable, is not always accessible outside the borders of individual clinics. This research aims to address part of this problem (at a study site) by constructing a maximal data set (knowledge artefact) for motor neurone disease (MND). This data set is proposed as an initial knowledge base for a concurrent project to develop an MND patient data platform. It represents the domain knowledge at the study site for the duration of the research (12 months). A knowledge elicitation method, the WICKED method, was also developed from the lessons learned during this process. WICKED is derived from the words: eliciting and confirming data, information, knowledge, wisdom. It is also a reference to the concept of wicked problems, which are complex and challenging, as is eliciting expert knowledge. The method was evaluated at a second site, and benefits and limitations were noted. Benefits include that the method provided a systematic way to manage data, information, knowledge and wisdom (DIKW) from various sources, including healthcare specialists and existing data sets. Limitations concerned the time required and the fact that the data set produced only represents the DIKW known during the research period. Future work is underway to address these limitations.

Keywords: healthcare, knowledge acquisition, maximal data sets, action design science

Procedia PDF Downloads 360
31310 Intrusion Detection System Using Linear Discriminant Analysis

Authors: Zyad Elkhadir, Khalid Chougdali, Mohammed Benattou

Abstract:

Most of the existing intrusion detection systems work on quantitative network traffic data with many irrelevant and redundant features, which makes the detection process more time-consuming and inaccurate. Several feature extraction methods, such as linear discriminant analysis (LDA), have been proposed. However, LDA suffers from the small sample size (SSS) problem, which occurs when the number of training samples is small compared with the sample dimension. Hence, classical LDA cannot be applied directly to high-dimensional data such as network traffic data. In this paper, we propose two solutions to the SSS problem for LDA and apply them to a network IDS. The first method reduces the original data dimension using principal component analysis (PCA) and then applies LDA. The second solution uses the pseudoinverse to avoid singularity of the within-class scatter matrix due to the SSS problem. After that, the KNN algorithm is used for the classification process. We have chosen two well-known datasets, KDDcup99 and NSL-KDD, for testing the proposed approaches. Results showed that the classification accuracy of the PCA+LDA method clearly outperforms the pseudoinverse LDA method when large training data are available.
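
A minimal sketch of the first solution (PCA for dimensionality reduction, LDA for the discriminant projection, then KNN), assuming scikit-learn and a synthetic stand-in for the 41-feature KDD-style records:

```python
from sklearn.datasets import make_classification
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.neighbors import KNeighborsClassifier
from sklearn.pipeline import Pipeline

# Synthetic stand-in for 41-feature KDD-style traffic records (normal/attack).
X, y = make_classification(n_samples=500, n_features=41, n_informative=10,
                           n_classes=2, random_state=0)

pipe = Pipeline([
    ("pca", PCA(n_components=20)),              # shrink dimension to avoid SSS
    ("lda", LinearDiscriminantAnalysis()),      # discriminant projection
    ("knn", KNeighborsClassifier(n_neighbors=5)),
])
pipe.fit(X[:400], y[:400])
print("test accuracy:", pipe.score(X[400:], y[400:]))
```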

Keywords: LDA, Pseudoinverse, PCA, IDS, NSL-KDD, KDDcup99

Procedia PDF Downloads 226
31309 Landslide Hazard Zonation Using Satellite Remote Sensing and GIS Technology

Authors: Ankit Tyagi, Reet Kamal Tiwari, Naveen James

Abstract:

Landslides are the major geo-environmental problem of the Himalaya because of its high ridges, steep slopes, deep valleys, and complex system of streams. They are mainly triggered by rainfall and earthquakes and cause severe damage to life and property. In Uttarakhand, the Tehri reservoir rim area, which is situated in the lesser Himalaya of the Garhwal hills, was selected for landslide hazard zonation (LHZ). The study utilized different types of data, including geological maps, topographic maps from the Survey of India, Landsat 8 imagery, and Cartosat DEM data. This paper presents the use of a weighted overlay method in LHZ using fourteen causative factors. The data layers generated and co-registered were slope, aspect, relative relief, soil cover, intensity of rainfall, seismic ground shaking, seismic amplification at surface level, lithology, land use/land cover (LULC), normalized difference vegetation index (NDVI), topographic wetness index (TWI), stream power index (SPI), drainage buffer, and reservoir buffer. Seismic analysis is performed using peak horizontal acceleration (PHA) intensity and amplification factors in the evaluation of the landslide hazard index (LHI). Several digital image processing techniques, such as topographic correction, NDVI, and supervised classification, were widely used in the terrain factor extraction. Lithological features, LULC, drainage patterns, lineaments, and structural features were extracted using digital image processing techniques; colour, tone, topography, and stream drainage patterns from the imagery were used to analyse geological features. The slope, aspect, and relative relief maps were created using Cartosat DEM data, which were also used for the detailed drainage analysis, including TWI, SPI, drainage buffer, and reservoir buffer. In the weighted overlay method, the comparative importance of the causative factors is obtained from experience. After multiplying each factor's influence weight by the rating of the corresponding class, the result is reclassified, and the LHZ map is prepared. Further, based on the land-use map developed from remote sensing images, a landslide vulnerability study for the study area is carried out and presented in this paper.
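
As a sketch of the weighted overlay arithmetic: each causative-factor raster is reclassified into class ratings, multiplied by its influence weight, and summed into the landslide hazard index (LHI), which is then binned into hazard zones. The toy layers, weights, and bin edges below are hypothetical.

```python
import numpy as np

# Toy 3x3 rasters of class ratings (1 = low, 5 = high susceptibility);
# real layers (slope, NDVI, TWI, ...) would come from the GIS stack.
layers = {
    "slope":    np.array([[3, 4, 5], [2, 3, 4], [1, 2, 3]]),
    "rainfall": np.array([[2, 2, 3], [3, 4, 4], [1, 1, 2]]),
    "lulc":     np.array([[1, 3, 3], [2, 2, 5], [1, 4, 2]]),
}
weights = {"slope": 0.5, "rainfall": 0.3, "lulc": 0.2}  # expert-assigned

lhi = sum(weights[k] * layers[k].astype(float) for k in layers)
# Reclassify the continuous index into hazard zones (0 = low ... 3 = very high)
zones = np.digitize(lhi, bins=[2.0, 3.0, 4.0])
print(lhi, zones, sep="\n")
```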

Keywords: weighted overlay method, GIS, landslide hazard zonation, remote sensing

Procedia PDF Downloads 133
31308 Information Theoretic Approach for Beamforming in Wireless Communications

Authors: Syed Khurram Mahmud, Athar Naveed, Shoaib Arif

Abstract:

Beamforming is a signal processing technique extensively utilized in wireless communications and radar for desired-signal intensification and interference minimization through spatial selectivity. In this paper, we present a method for calculating optimal weight vectors for a smart antenna array, to achieve a directive pattern during transmission and selective reception in an interference-prone environment. In the proposed scheme, Mutual Information (MI) extrema are evaluated through an energy-constrained objective function, which is based on a-priori information about the interference source and the desired array factor. Signal to Interference plus Noise Ratio (SINR) performance is evaluated for both transmission and reception. In our scheme, MI is presented as an index that identifies the trade-off between information gain, SINR, illumination time, and spatial selectivity in an energy-constrained optimization problem. The employed method yields lower computational complexity, as shown through comparative analysis with conventional methods. MI-based beamforming enhances signal integrity in degraded environments while reducing computational complexity and linking key performance indicators.
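
The paper's MI-based objective is not reproduced here, but the quantities it trades off can be illustrated with the standard narrowband array model: the sketch below builds steering vectors for a uniform linear array, forms MVDR-type weights as a conventional baseline, and evaluates the output SINR. All scenario parameters are hypothetical.

```python
import numpy as np

def steering_vector(theta_rad, n, d=0.5):
    """Uniform linear array response; element spacing d in wavelengths."""
    k = np.arange(n)
    return np.exp(-2j * np.pi * d * k * np.sin(theta_rad))

def sinr(w, a_s, sigma_s2, r_in):
    """Output SINR for weights w, signal steering a_s, signal power
    sigma_s2, and interference-plus-noise covariance r_in."""
    num = sigma_s2 * abs(w.conj() @ a_s) ** 2
    den = (w.conj() @ r_in @ w).real
    return num / den

n = 8
a_s = steering_vector(np.deg2rad(10), n)    # desired signal at 10 degrees
a_i = steering_vector(np.deg2rad(-40), n)   # interferer at -40 degrees
r_in = 10.0 * np.outer(a_i, a_i.conj()) + np.eye(n)  # interferer + noise
w_mvdr = np.linalg.solve(r_in, a_s)                  # MVDR-type weights
print("SINR (linear):", sinr(w_mvdr, a_s, 1.0, r_in))
```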

Keywords: beamforming, interference, mutual information, wireless communications

Procedia PDF Downloads 280
31307 A Fuzzy Approach to Liver Tumor Segmentation with Zernike Moments

Authors: Abder-Rahman Ali, Antoine Vacavant, Manuel Grand-Brochier, Adélaïde Albouy-Kissi, Jean-Yves Boire

Abstract:

In this paper, we present a new segmentation approach for liver lesions in regions of interest within MRI (magnetic resonance imaging). This approach, based on a two-cluster Fuzzy C-Means methodology, considers the variable compactness parameter to handle uncertainty. Fine boundaries are detected by a local recursive merging of ambiguous pixels using sequential forward floating selection with Zernike moments. The method has been tested on both synthetic and real images. When applied to synthetic images, the proposed approach performs well: the segmentations obtained are accurate, their shape is consistent with the ground truth, and the extracted information is reliable. The results obtained on MR images confirm these observations. Even for difficult MR images, our approach extracts a segmentation with good accuracy and shape, which implies that the geometry of the tumor is preserved for further clinical activities (such as automatic extraction of pharmacokinetic properties, lesion characterization, etc.).
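
For reference, a minimal two-cluster Fuzzy C-Means implemented from scratch on 1-D intensities; the paper's variable-compactness term and Zernike-moment refinement are not included, so this only shows the base clustering the approach builds on.

```python
import numpy as np

def fcm(x, c=2, m=2.0, iters=100, tol=1e-5, seed=0):
    """Minimal fuzzy C-means on data x of shape (n, d)."""
    rng = np.random.default_rng(seed)
    u = rng.random((c, len(x)))
    u /= u.sum(axis=0)                        # memberships sum to 1 per point
    p = 2.0 / (m - 1.0)
    for _ in range(iters):
        um = u ** m
        centers = um @ x / um.sum(axis=1, keepdims=True)
        d = np.linalg.norm(x[None, :, :] - centers[:, None, :], axis=2) + 1e-12
        u_new = 1.0 / (d ** p * (1.0 / d ** p).sum(axis=0))
        if np.abs(u_new - u).max() < tol:
            u = u_new
            break
        u = u_new
    return centers, u

rng = np.random.default_rng(1)
pixels = np.concatenate([rng.normal(0.2, 0.05, (200, 1)),   # "background"
                         rng.normal(0.7, 0.05, (200, 1))])  # "lesion"
centers, u = fcm(pixels)
labels = u.argmax(axis=0)   # hard labels after defuzzification
print("cluster centers:", centers.ravel())
```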

Keywords: defuzzification, floating search, fuzzy clustering, Zernike moments

Procedia PDF Downloads 452
31306 Green Revolution and Reckless Use of Water and Its Implication on Climate Change Leading to Desertification: Situation of Karnataka, India

Authors: Arun Das

Abstract:

One of the basic objectives of independent India five decades ago was to meet the increasing demand for food from its growing population. Self-sufficiency in food production was accomplished through the launch of the green revolution program, whose repercussions were not realized at the time. Many projects were undertaken; in particular, major and minor irrigation projects were executed to harness river water in the dry land regions of Karnataka. In the elevated lands, extraction of underground water was a solace offered by the government to protect the interests of dry land farmers whose land did not fall under the command area. Free borewell digging, pump sets, and electricity were provided, and thus self-sufficiency was achieved. Contrary to this, continuous long-term extraction of water for agriculture from borewells and in the irrigated tracts has led to a two-fold effect: first, soil leaching (alkalinity and salinity); second, depletion of underground water to incredible depths, damaging the natural system beyond repair, to the point where nature can no longer support even tiny plants like grass and human and animal habitation is discouraged. Both processes are silently turning the southwestern, central, northeastern, and northwestern regions of Karnataka into desert. The grave situation of Karnataka's green revolution is addressed in this paper as an alert against the reckless use of water, and some suggestions are recommended based on ground information.

Keywords: alkalinity, desertification, green revolution, salinity, water

Procedia PDF Downloads 283
31305 A Scientific Method of Drug Development Based on Ayurvedic Bhaishajya Knowledge

Authors: Rajesh S. Mony, Vaidyaratnam Oushadhasala

Abstract:

An attempt is made in this study to evolve a drug development modality based on the classical Ayurvedic knowledge base as well as on modern scientific methodology. The study involves (a) identification of a specific ailment condition, (b) selection of a polyherbal formulation, (c) deciding a suitable extraction procedure, (d) confirming the efficacy of the combination by in-vitro trials, and (e) fixing the recommended dose. The ailment selected is the arthritic condition, and the selected herbal combination is Kunturushka, Vibhitaki, Guggulu, Haridra, Maricha and Nirgundi, chosen as per classical Ayurvedic references and authenticated as per the Ayurvedic Pharmacopoeia of India. Each drug was extracted with hydroalcoholic menstruums in different ratios. After removal of residual solvent, each extract was assessed in vitro for anti-inflammatory and anti-arthritic activities and for COX enzyme inhibition (by UV-Vis spectrophotometry with positive controls). Extracts showing good in-vitro activity were selected, and QC testing of each selected extract, including HPTLC, established the in-process QC specifications. The single dose of the mixture of selected extracts was decided according to the level of in-vitro activity and the available toxicology data. Major groups such as phenolics, flavonoids, alkaloids and bitters were quantified by standard spectrophotometric and gravimetric methods. A marker assay method was developed and validated by HPTLC, and a well-resolved HPTLC fingerprint was developed for the single-dosage API (active pharmaceutical ingredient, the mixture of extracts). Three batches were prepared to fix the in-process and API QC specifications.

Keywords: drug development, anti-inflammatory, quality standardisation, planar chromatography

Procedia PDF Downloads 99
31304 Dimensionality Reduction in Modal Analysis for Structural Health Monitoring

Authors: Elia Favarelli, Enrico Testi, Andrea Giorgetti

Abstract:

Autonomous structural health monitoring (SHM) of structures and bridges has become a topic of paramount importance for maintenance purposes and safety reasons. This paper proposes a set of machine learning (ML) tools to perform automatic feature selection and detection of anomalies in a bridge from vibrational data, and compares different feature extraction schemes to increase accuracy and reduce the amount of data collected. As a case study, the Z-24 bridge is considered because of its extensive database of accelerometric data in both standard and damaged conditions. The proposed framework starts from the first four fundamental frequencies extracted through operational modal analysis (OMA) and clustering, followed by density-based time-domain filtering (tracking). The extracted fundamental frequencies are then fed to a dimensionality reduction block implemented through two different approaches: feature selection (an intelligent multiplexer) that tries to estimate the most reliable frequencies based on the evaluation of some statistical features (i.e., mean value, variance, kurtosis), and feature extraction (an auto-associative neural network (ANN)) that combines the fundamental frequencies to extract new damage-sensitive features in a low-dimensional feature space. Finally, one-class classifier (OCC) algorithms perform anomaly detection, trained with standard-condition points and tested with both normal and anomalous ones. In particular, a new anomaly detection strategy is proposed, namely one class classifier neural network two (OCCNN2), which exploits the classification capability of standard classifiers in an anomaly detection problem, finding the standard class (the boundary of the feature space in normal operating conditions) through a two-step approach: coarse and fine boundary estimation. The coarse estimation uses classic OCC techniques, while the fine estimation is performed through a feedforward neural network (NN) that exploits the boundaries estimated in the coarse step. The detection algorithms are then compared with known methods based on principal component analysis (PCA), kernel principal component analysis (KPCA), and the auto-associative neural network (ANN). In many cases, the proposed solution outperforms the standard OCC algorithms in terms of F1 score and accuracy. In particular, by evaluating the correct features, the anomaly can be detected with an accuracy and an F1 score greater than 96% with the proposed method.
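
OCCNN2 itself is the authors' contribution; as a simpler illustration of the one-class setting it competes with, the sketch below trains one of the named baselines (PCA) on normal-condition frequency snapshots only and flags anomalies by reconstruction error. The synthetic frequencies and the 99th-percentile threshold are assumptions.

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
# Stand-in for tracked fundamental frequencies (Hz): 4 features per snapshot.
normal = rng.normal([3.9, 5.0, 9.8, 10.3], 0.02, size=(500, 4))
damaged = rng.normal([3.7, 4.9, 9.5, 10.1], 0.02, size=(100, 4))

pca = PCA(n_components=2).fit(normal[:400])     # train on normal data only

def recon_error(x):
    """Distance between a snapshot and its PCA reconstruction."""
    return np.linalg.norm(x - pca.inverse_transform(pca.transform(x)), axis=1)

thr = np.quantile(recon_error(normal[:400]), 0.99)  # normal-class boundary
test = np.vstack([normal[400:], damaged])
pred = recon_error(test) > thr                      # True => anomaly
print("flagged anomalies:", pred.sum(), "of", len(test))
```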

Keywords: anomaly detection, frequencies selection, modal analysis, neural network, sensor network, structural health monitoring, vibration measurement

Procedia PDF Downloads 123
31303 Assessing the Impact of Additional Information during Motor Preparation in Lane Change Task

Authors: Nikita Rajendra Sharma, Jai Prakash Kushvah, Gerhard Rinkenauer

Abstract:

Driving a car is a discrete aiming movement in which drivers aim at successful extraction of relevant information and elimination of potentially distracting information. It is motor preparation that enables one to react to stimuli on site by allowing perceptual processing for optimal adjustment. Drivers prepare their responses according to the available resources of advance and ongoing information to drive efficiently, which requires constant programming and reprogramming of the motor system. Reaction time (RT) is shorter when a response signal is preceded by a warning signal: the warning signal causes the participant to prepare for the upcoming response by updating the motor program before execution. While performing the primary task of changing lanes, the simultaneous occurrence of additional information during the presentation of cues (congruent or incongruent with respect to the target cue) might impact motor preparation and execution. The presence of additional information (other than the warning or response signal) between the warning signal and the imperative stimulus influences human motor preparation to a reasonable extent. The present study aimed to assess the impact of congruent and incongruent additional information (with respect to the imperative stimulus) on driving performance (reaction time, steering wheel amplitude, and steering wheel duration) during a lane change task, implementing a movement pre-cueing paradigm. Twenty-two young drivers holding valid licenses (mean age = 24.1 +/- 3.21 years, 10 male, 12 female, age range 21-33 years) participated in the study. The study revealed that additional information influenced overall driving performance both as a potential distractor and as relevant information. Participants took longer to respond, and higher steering wheel angles were reported, for targets coupled with additional information compared with warning signals preceded by potential distractors, and response times were longer for a higher number of lanes (2 lanes > 1 lane). The same additional information, appearing interchangeably at warning signals and targets, worked as relevant information facilitating motor programming in the trials where it was congruent with the lane change direction.

Keywords: additional information, lane change task, motor preparation, movement pre-cueing, reaction time, steering wheel amplitude

Procedia PDF Downloads 191
31302 Quantification of Polychlorinated Biphenyls (PCBs) in Soil Samples of Electrical Power Substations from Different Cities in Nigeria

Authors: Omasan Urhie Urhie, Adenipekun C. O, Eke W., Ogwu K., Erinle K. O

Abstract:

Polychlorinated biphenyls (PCBs) are persistent organic pollutants (POPs) that are very toxic; they can accumulate in soil and in human tissues, resulting in health issues such as birth defects, reproductive disorders, and cancer. PCBs pollute the air through volatilization and dispersion; they also contaminate soil and sediments and are not easily degraded. Soil samples were collected from a depth of 0-15 cm at three substations (Warri, Ughelli, and Ibadan) of the Power Holding Company of Nigeria (PHCN) where old transformers were dumped. Extraction and cleanup of the soil samples were conducted by accelerated solvent extraction (ASE) with pressurized liquid extraction (PLE). The concentration of PCBs was determined using gas chromatography/mass spectrometry (GC/MS). Mean total PCB concentrations in the soil samples increased in the order Ughelli < Ibadan < Warri: 2.457757 ppm at the Ughelli substation, 4.198926 ppm at the Ibadan substation, and 14.05065 ppm at the Warri substation. In the Warri samples, PCB-167 was the most abundant at about 30% (4.28086 ppm) of the total PCB concentration (14.05065 ppm), followed by PCB-157 at about 20% (2.77871 ppm). Of the total PCBs in the Ughelli and Ibadan samples, PCB-156 was the most abundant at about 44% and 40%, respectively. This study provides a baseline report on the presence of PCBs in the vicinity of abandoned electrical power facilities in different cities in Nigeria.

Keywords: polychlorinated biphenyls, persistent organic pollutants, soil, transformer

Procedia PDF Downloads 139
31301 Filling the Gap of Extraction of Digital Evidence from Emerging Platforms Without Forensics Tools

Authors: Yi Anson Lam, Siu Ming Yiu, Kam Pui Chow

Abstract:

Digital evidence has been tendered to courts at an exponential rate in recent years. As an industry practice, most digital evidence is extracted and preserved using specialized and well-accepted forensics tools. On the other hand, advances in technology have enabled the creation of quite a few emerging platforms such as Telegram, Signal, etc. Existing (well-accepted) forensics tools were not designed to extract evidence from these emerging platforms. While new forensics tools require a significant amount of time and effort to be developed and verified, this paper addresses how to fill this gap using quick-fix alternative methods for digital evidence collection (e.g., based on APIs provided by the apps) and discusses issues related to the admissibility of such evidence in court, with support from the stances of international courts and the circumstances under which digital evidence collected through these proposed alternatives has been accepted.
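
Whatever collection route is used, preserving integrity is central to admissibility. A minimal sketch of one common safeguard, hashing each collected record at acquisition time; the record structure shown is hypothetical, not a specific app's export format.

```python
import hashlib
import json
from datetime import datetime, timezone

def seal_record(record):
    """Attach a SHA-256 digest and a UTC timestamp to one collected item,
    so later tampering with the stored copy is detectable."""
    payload = json.dumps(record, sort_keys=True).encode("utf-8")
    return {"record": record,
            "sha256": hashlib.sha256(payload).hexdigest(),
            "collected_at": datetime.now(timezone.utc).isoformat()}

# Hypothetical message structure returned by an app's export API.
msg = {"chat": "example", "sender": "alice", "text": "hello", "id": 42}
print(seal_record(msg)["sha256"])
```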

Keywords: extraction, digital evidence, laws, investigation

Procedia PDF Downloads 68
31300 Data Collection in Protected Agriculture for Subsequent Big Data Analysis: Methodological Evaluation in Venezuela

Authors: Maria Antonieta Erna Castillo Holly

Abstract:

During the last decade, data analysis, strategic decision making, and the use of artificial intelligence (AI) tools in Latin American agriculture have been a challenge. In some countries, the availability, quality, and reliability of historical data, in addition to the current data recording methodology in the field, make it difficult to use information systems, complete data analysis, and rely on them for making the right strategic decisions. This is essential in Agriculture 4.0, where the increase in global demand for fresh agricultural products of tropical origin during all seasons of the year requires a change in the production model and greater agility in responding to consumer market demands for quality, quantity, traceability, and sustainability (which means extensive data). Having quality information available and updated in real time on what, how much, how, when, where, at what cost, and on compliance with production quality standards represents the greatest challenge for sustainable and profitable agriculture in the region. The objective of this work is to present a methodological proposal for the collection of georeferenced data from the protected agriculture sector, specifically in production units (UP) with tall structures (greenhouses), initially for Venezuela, taking the state of Mérida as the geographical framework and horticultural products as target crops. The document presents some background information and explains the methodology and tools used in the three phases of the work: diagnosis, data collection, and analysis. As a result, an evaluation of the process is carried out, relevant data and dashboards are displayed, and the first satellite maps integrated with information layers in a geographic information system are presented. Finally, some improvement proposals and tentatively recommended applications are added to the process, the objective being to provide better-qualified and traceable georeferenced data for subsequent analysis and for more agile and accurate strategic decision making. One of the main points of this study is the lack of quality data treatment in Latin America, and especially in the Caribbean basin, one of the most important issues being how to manage the lack of complete official data. The methodology has been tested with horticultural products, but it can be extended to other tropical crops.

Keywords: greenhouses, protected agriculture, data analysis, geographic information systems, Venezuela

Procedia PDF Downloads 131
31299 Improving Fake News Detection Using K-means and Support Vector Machine Approaches

Authors: Kasra Majbouri Yazdi, Adel Majbouri Yazdi, Saeid Khodayi, Jingyu Hou, Wanlei Zhou, Saeed Saedy

Abstract:

Fake news and false information are big challenges for all types of media, especially social media. There is a lot of false information, fake likes, views, and duplicated accounts, as big social networks such as Facebook and Twitter have admitted. Much of the information appearing on social media is doubtful and in some cases misleading; it needs to be detected as soon as possible to avoid a negative impact on society. The dimensions of fake news datasets are growing rapidly, so to detect false information with less computation time and complexity, the dimensionality needs to be reduced. One of the best techniques for reducing data size is feature selection, which chooses a feature subset from the original set to improve classification performance. In this paper, a feature selection method is proposed that integrates K-means clustering and Support Vector Machine (SVM) approaches and works in four steps. First, the similarities between all features are calculated. Then, the features are divided into several clusters. Next, the final feature set is selected from all clusters, and finally, fake news is classified based on the final feature subset using the SVM method. The proposed method was evaluated by comparing its performance with other state-of-the-art methods on several benchmark datasets, and the outcome showed better classification of false information. The detection performance was improved in two respects: the detection runtime decreased, and the classification accuracy increased because of the elimination of redundant features and the reduction of the dataset dimensions.
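
A minimal sketch of the cluster-then-select idea with scikit-learn: features (not samples) are clustered with K-means, one representative feature is kept per cluster, and an SVM is trained on the reduced set. The representative-per-cluster rule and the synthetic data are assumptions, since the abstract does not detail the selection step.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

# Synthetic stand-in for a vectorized fake-news dataset.
X, y = make_classification(n_samples=600, n_features=60, n_informative=15,
                           random_state=0)

k = 10
km = KMeans(n_clusters=k, n_init=10, random_state=0).fit(X.T)  # cluster FEATURES
selected = []
for c in range(k):
    members = np.where(km.labels_ == c)[0]
    # keep the feature closest to its cluster centroid as the representative
    dists = np.linalg.norm(X.T[members] - km.cluster_centers_[c], axis=1)
    selected.append(members[np.argmin(dists)])

Xtr, Xte, ytr, yte = train_test_split(X[:, selected], y, random_state=0)
clf = SVC(kernel="rbf").fit(Xtr, ytr)
print("accuracy on reduced features:", clf.score(Xte, yte))
```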

Keywords: clustering, fake news detection, feature selection, machine learning, social media, support vector machine

Procedia PDF Downloads 176
31298 Pretreatment of Cattail (Typha domingensis) Fibers to Obtain Cellulose Nanocrystals

Authors: Marivane Turim Koschevic, Maycon dos Santos, Marcello Lima Bertuci, Farayde Matta Fakhouri, Silvia Maria Martelli

Abstract:

Natural fibers are raw materials rich in cellulose and abundant worldwide, and their use for the extraction of cellulose nanocrystals is promising; one example is the cattail, a macrophyte weed native to South America. This study deals with the pre-treatment of crushed cattail fibers by six different mercerization methods, followed by bleaching. The positive effects of the fiber treatments were observed by means of optical microscopy and Fourier transform infrared spectroscopy (FTIR). The sample selected for future cellulose nanocrystal extraction tests was treated with 2.5% NaOH for 2 h at 60 °C in the first stage, then with 30 vol H2O2 and 5% NaOH in the proportion 30/70% (v/v) for 1 hour at 60 °C, followed by treatment at 50/50% (v/v) for 15 minutes at 50 °C with the same solution constituents.

Keywords: cellulose nanocrystal, chemical treatment, mercerization, natural fibers

Procedia PDF Downloads 293
31297 Real World Evidence: A Tool to Overcome the Lack of a Comparative Arm in Drug Evaluation in the Context of Rare Diseases

Authors: Mohamed Wahba

Abstract:

Objective: To build a comparative arm for product (X) in a specific gene-mutated advanced gastrointestinal cancer using real world evidence to fulfill HTA requirements in drug evaluation. Methods: Data for product (X) were collected from a phase II clinical trial, while real world data for (Y) and (Z) were collected from a US database. Real-world (RW) cohorts were matched to clinical trial baseline characteristics using the weighting-by-odds method. Outcomes included progression-free survival (PFS) and overall survival (OS) rates. Study location and participants: international (product X, n=80) and from the USA (products Y and Z, n=73). Results: Two comparisons were made: trial cohort 1 (X) versus real-world cohort 1 (Z), and trial cohort 2 (X) versus real-world cohort 2 (Y). For first line, the median OS was 9.7 months (95% CI 8.6-11.5) and the median PFS was 5.2 months (95% CI 4.7-not reached) for real-world cohort 1. For second line, the median OS was 10.6 months (95% CI 4.7-27.3) for real-world cohort 2 and the median PFS was 5.0 months (95% CI 2.1-29.3). Results were statistically significant for the OS analysis but not for the PFS analysis. Conclusion: This study provided the clinical comparative outcomes needed for HTA evaluation.
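
In the weighting-by-odds approach, a propensity model for trial membership is fitted on baseline covariates, and each real-world patient is weighted by the odds e(x)/(1-e(x)) so that the external cohort resembles the trial population. A minimal sketch with toy covariates (the actual baseline variables are not listed in the abstract):

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
x = rng.normal(size=(153, 3))            # toy baseline covariates
s = np.array([1] * 80 + [0] * 73)        # 1 = trial (X), 0 = real-world (Y/Z)

# Propensity of trial membership given covariates.
ps = LogisticRegression().fit(x, s).predict_proba(x)[:, 1]

# Weighting by odds: trial patients keep weight 1; each real-world patient
# is weighted by e(x) / (1 - e(x)) to resemble the trial population.
w = np.where(s == 1, 1.0, ps / (1.0 - ps))
print("effective real-world sample size:", round(w[s == 0].sum(), 1))
```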

Keywords: real world evidence, pharmacoeconomics, HTA agencies, oncology

Procedia PDF Downloads 90
31296 An Improved Circulating Tumor Cells Analysis Method for Identifying Tumorous Blood Cells

Authors: Salvador Garcia Bernal, Chi Zheng, Keqi Zhang, Lei Mao

Abstract:

Circulating tumor cell (CTC) analysis is used to detect tumor cell metastases using blood samples from patients with cancer (lung, breast, etc.). Using an immunofluorescence method, a three-channel image (red, green, and blue) is obtained. These sets of images usually exceed 11 x 30 M pixels in size. An aided tool for imaging cell analysis is designed to segment and identify the tumorous cells based on the three marker signals. Our method is cell-based (using area and cell shape), considers the information in each channel, and decides whether a candidate is a valid CTC. The system also reports the number and size of the tumor cells found in the sample. We present results on real-life samples, achieving acceptable performance in identifying CTCs in a short time.

Keywords: Circulating Tumor Cells (CTC), cell analysis, immunofluorescent, medical image analysis

Procedia PDF Downloads 214
31295 Improving Students’ Participation in Group Tasks: Case Study of Adama Science and Technology University

Authors: Fiseha M. Guangul, Annissa Muhammed, Aja O. Chikere

Abstract:

Group tasks are one method to create a conducive environment for the active teaching-learning process. Performing group tasks with the active involvement of students benefits them in many ways. However, in most cases not all students participate actively in the group task, and hence the intended benefits are not realized. This paper presents improvements in students' participation in group tasks, and in their learning from them, achieved by introducing different techniques to enhance participation. For the purpose of this research, the Carpentry and Joinery II (WT-392) course from the Wood Technology Department at Adama Science and Technology University was selected, and five groups were formed. Ten group tasks were prepared. The first five were distributed to the five groups on the first day, without introducing the techniques used to enhance students' participation. On another day, the other five group tasks were distributed to the same groups, and various techniques to enhance students' participation were introduced. The students' participation and learning from the group task were then evaluated. After implementing the techniques, the evaluation showed significant improvements in the students' participation in, and learning from, the group task.

Keywords: group task, students' participation, active learning, evaluation method

Procedia PDF Downloads 214
31294 A Neural Approach for the Offline Recognition of the Arabic Handwritten Words of the Algerian Departments

Authors: Salim Ouchtati, Jean Sequeira, Mouldi Bedda

Abstract:

In this work we present an offline system for the recognition of Arabic handwritten words naming the Algerian departments. The study is based mainly on the evaluation of the performance of a neural network trained with the gradient back-propagation algorithm. The parameters used to form the input vector of the neural network are extracted from the binary images of the handwritten words by several methods: distribution parameters, the centered moments of the different projections, and the Barr features. These methods are applied to segments obtained after dividing the binary image of the word into six segments. The classification is achieved by a multilayer perceptron. Detailed experiments are carried out, and satisfactory recognition results are reported.
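
A minimal sketch of the segment-then-extract-then-classify pipeline: each binary word image is split into six vertical segments, simple projection statistics stand in for the paper's distribution/moment/Barr features, and a multilayer perceptron is trained on the result. The toy images and labels are placeholders for real segmented word images.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier

def projection_features(binary_word, n_segments=6):
    """Split the binary word image into 6 vertical segments and describe
    each by statistics of its horizontal/vertical projections."""
    feats = []
    for seg in np.array_split(binary_word, n_segments, axis=1):
        h = seg.sum(axis=1).astype(float)   # horizontal projection
        v = seg.sum(axis=0).astype(float)   # vertical projection
        for p in (h, v):
            total = p.sum() + 1e-9
            feats += [p.mean(), p.std(), (p * np.arange(len(p))).sum() / total]
    return np.array(feats)

# Toy data: random binary "word images" with placeholder labels, only to
# show the pipeline shape.
rng = np.random.default_rng(0)
X = np.stack([projection_features(rng.integers(0, 2, (32, 96)))
              for _ in range(200)])
y = rng.integers(0, 48, 200)               # e.g., 48 department classes
clf = MLPClassifier(hidden_layer_sizes=(64,), max_iter=300).fit(X, y)
```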

Keywords: handwritten word recognition, neural networks, image processing, pattern recognition, features extraction

Procedia PDF Downloads 513
31293 Merging of Results in Distributed Information Retrieval Systems

Authors: Larbi Guezouli, Imane Azzouz

Abstract:

This work is located in the domain of distributed information retrieval (DIR). A simplified view of DIR requires a multi-search over a set of collections, which forces the system to analyze the results found in these collections and merge them into a single list before sending them to the user. Our work is to find a fusion method based on the relevance score of each result received from the collections and on the relevance of the local search engine of each collection.
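
A minimal sketch of such a merge, assuming global score = normalized local score x a per-collection engine weight; the normalization and weighting rules below are illustrative assumptions, not the paper's exact formula.

```python
def merge_results(collection_runs, collection_weights, k=10):
    """Merge ranked lists from several collections into one list.

    collection_runs: {collection_id: [(doc_id, local_score), ...]}
    collection_weights: {collection_id: engine relevance in [0, 1]}
    Local scores are min-max normalized within each collection, then
    scaled by the collection's engine weight.
    """
    merged = []
    for cid, run in collection_runs.items():
        scores = [s for _, s in run]
        lo, hi = min(scores), max(scores)
        for doc, s in run:
            norm = (s - lo) / (hi - lo) if hi > lo else 1.0
            merged.append((f"{cid}:{doc}", norm * collection_weights[cid]))
    return sorted(merged, key=lambda t: t[1], reverse=True)[:k]

runs = {"c1": [("d1", 12.0), ("d2", 7.5)], "c2": [("d9", 0.9), ("d4", 0.4)]}
print(merge_results(runs, {"c1": 0.8, "c2": 0.5}))
```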

Keywords: information retrieval, distributed IR systems, merging results, data mining

Procedia PDF Downloads 336
31292 Numerical Investigation of Nanofluid Based Thermosyphon System

Authors: Kiran Kumar K., Ramesh Babu Bejjam, Atul Najan

Abstract:

A thermosyphon system is a heat transfer loop that operates on the basis of gravity and buoyancy forces. It guarantees good reliability and low maintenance cost, as it does not involve any mechanical pump; it can therefore be used in many industrial applications such as refrigeration and air conditioning, electronic cooling, nuclear reactors, and geothermal heat extraction. However, flow instabilities and loop configuration are the major problems in this system. Several previous studies found that instabilities can be suppressed by using nanofluids as the loop fluid. In the present study, a rectangular thermosyphon loop with end heat exchangers is considered; this configuration is appropriate for many practical applications such as solar water heaters and geothermal heat extraction. A steady-state analysis is carried out on the thermosyphon loop with parallel-flow coaxial heat exchangers at the heat source and heat sink. In this loop, a nanofluid is the loop fluid, and water is the external fluid in both the hot and cold heat exchangers. For this analysis, a one-dimensional homogeneous model is developed in which the conservation equations of mass, momentum, and energy are discretized using the finite difference method. A computer code is written in MATLAB to simulate the flow in the thermosyphon loop. A comparison in terms of heat transfer is made between water and nanofluid as working fluids in the loop.
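
The paper's code is in MATLAB; as a language-consistent illustration of the kind of one-dimensional finite-difference marching such a model uses, the Python sketch below solves a steady-state energy balance around a loop with a heated and a cooled section. All geometry, flow, and property values are hypothetical.

```python
import numpy as np

# One-dimensional steady-state energy balance marched around the loop:
#   m_dot * cp * dT/dx = q'' * P  in heated/cooled sections, 0 elsewhere.
L, n = 4.0, 400                      # loop length (m), grid nodes
dx = L / n
m_dot, cp = 0.02, 4000.0             # mass flow (kg/s), nanofluid cp (J/kg K)
P = 0.05                             # heated perimeter (m)
q = np.zeros(n)                      # wall heat flux q'' (W/m^2)
q[:n // 4] = 20e3                    # heater on the first quarter of the loop
q[n // 2:3 * n // 4] = -20e3         # matching cooler on the third quarter

T = np.empty(n)
T[0] = 300.0                         # K, guessed inlet temperature
for i in range(1, n):                # upwind finite-difference marching
    T[i] = T[i - 1] + q[i - 1] * P * dx / (m_dot * cp)
print("loop temperature rise:", round(T.max() - T.min(), 2), "K")
```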

Keywords: heat exchanger, heat transfer, nanofluid, thermosyphon loop

Procedia PDF Downloads 477
31291 Evaluation of MPPT Algorithms for Photovoltaic Generator by Comparing Incremental Conductance Method, Perturbation and Observation Method and the Method Using Fuzzy Logic

Authors: Elmahdi Elgharbaoui, Tamou Nasser, Ahmed Essadki

Abstract:

In the era of sustainable development, photovoltaic (PV) technology has shown significant potential as a renewable energy source. Photovoltaic generators (GPV) have a non-linear current-voltage characteristic with a maximum power point (MPP) characterized by an optimal voltage, which depends on environmental factors such as temperature and irradiation. To extract at all times the maximum power available at the terminals of the GPV and transfer it to the load, an adaptation stage is used, consisting of a boost chopper controlled by a maximum power point tracking (MPPT) technique through a pulse width modulation (PWM) stage. Our choice focused on three techniques: the perturbation and observation (P&O) method, the incremental conductance (InCond) method, and fuzzy logic control. The implementation and simulation of the system (photovoltaic generator, boost chopper, PWM, and MPPT techniques) are performed in the Matlab/Simulink environment.
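
For concreteness, one P&O iteration can be written in a few lines: perturb the operating voltage, observe the change in power, and keep or reverse the perturbation direction accordingly. The sketch below runs it against a toy power-voltage curve with its MPP at 17 V; the step size and curve are illustrative, not the paper's simulation model.

```python
def perturb_and_observe(v, p, v_prev, p_prev, step=0.5):
    """One P&O iteration: return the next reference voltage.

    If the last perturbation increased power, keep moving in the same
    direction; otherwise reverse direction.
    """
    dv, dp = v - v_prev, p - p_prev
    if dp == 0:
        return v + step
    if (dp > 0) == (dv > 0):
        return v + step       # still climbing toward the MPP
    return v - step           # overshot: step back

pv_power = lambda v: 100.0 - (v - 17.0) ** 2   # toy P-V curve, MPP at 17 V

v_prev, p_prev = 14.0, pv_power(14.0)
v = 14.5
for _ in range(30):
    p = pv_power(v)
    v, v_prev, p_prev = perturb_and_observe(v, p, v_prev, p_prev), v, p
print("settled near MPP voltage:", v)   # oscillates around 17 V
```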

Keywords: photovoltaic generator, technique MPPT, boost chopper, PWM, fuzzy logic, P&O, InCond

Procedia PDF Downloads 323
31290 Ligandless Extraction and Determination of Trace Amounts of Lead in Pomegranate, Zucchini and Lettuce Samples after Dispersive Liquid-Liquid Microextraction with Ultrasonic Bath and Optimization of Extraction Condition with RSM Design

Authors: Fariba Tadayon, Elmira Hassanlou, Hasan Bagheri, Mostafa Jafarian

Abstract:

Heavy metals are released into water, plants, soil, and food by natural and human activities. Lead plays a toxic role in the human body and may cause serious problems even at low concentrations, since it has several adverse effects on humans. Therefore, the determination of lead in different samples is an important procedure in studies of environmental pollution. In this work, an ultrasonic-assisted ionic-liquid-based dispersive liquid-liquid microextraction (UA-IL-DLLME) procedure for the determination of lead in zucchini, pomegranate, and lettuce has been established and developed using a flame atomic absorption spectrometer (FAAS). For the UA-IL-DLLME procedure, 10 mL of the sample solution containing Pb2+ was adjusted to pH 5 in a glass test tube with a conical bottom; then, 120 μL of 1-hexyl-3-methylimidazolium hexafluorophosphate (CMIM)(PF6) was rapidly injected into the sample solution with a microsyringe. The resulting cloudy mixture was treated ultrasonically for 5 min, the two phases were then separated by centrifugation for 5 min at 3000 rpm, the IL phase was diluted with 1 mL of ethanol, and the analytes were determined by FAAS. The effects of different experimental parameters in the extraction step, including ionic liquid volume, sonication time, and pH, were studied and optimized simultaneously using response surface methodology (RSM) with a central composite design (CCD). The optimal conditions were an ionic liquid volume of 120 μL, a sonication time of 5 min, and pH 5. The linear range of the calibration curve for the determination of lead by FAAS was 0.1-4 ppm with R2 = 0.992. Under optimized conditions, the limit of detection (LOD) for lead was 0.062 μg.mL-1, the enrichment factor (EF) was 93, and the relative standard deviation (RSD) was 2.29%. The lead levels for pomegranate, zucchini, and lettuce were 2.88 μg.g-1, 1.54 μg.g-1, and 2.18 μg.g-1, respectively. Therefore, this method has been successfully applied to the analysis of lead content in different food samples by FAAS.

Keywords: dispersive liquid-liquid microextraction, central composite design, food samples, flame atomic absorption spectrometry

Procedia PDF Downloads 283
31289 Verification of Sr-90 Determination in Water and Spruce Needles Samples Using IAEA-TEL-2016-04 ALMERA Proficiency Test Samples

Authors: S. Visetpotjanakit, N. Nakkaew

Abstract:

Determination of 90Sr in environmental samples has been widely developed with several radioanalytical methods and radiation measurement techniques, since 90Sr is one of the most hazardous radionuclides produced by nuclear reactors. A liquid extraction technique using di-(2-ethylhexyl) phosphoric acid (HDEHP) to separate and purify 90Y, followed by Cherenkov counting with a liquid scintillation counter to determine 90Y in secular equilibrium with 90Sr, was developed and performed at our institute, the Office of Atoms for Peace. The approach is inexpensive, non-laborious, and fast for analyzing 90Sr in environmental samples. To validate our analytical performance against the accuracy and precision criteria, 90Sr was determined in the IAEA-TEL-2016-04 ALMERA proficiency test samples for statistical evaluation. The experiment used two spiked tap water samples and one naturally contaminated spruce needle sample from Austria, collected shortly after the Chernobyl accident. All three analyses passed both the accuracy and precision criteria, obtaining "Accepted" statuses. The two water samples gave measured results of 15.54 Bq/kg and 19.76 Bq/kg, with relative biases of 5.68% and -3.63% against Maximum Acceptable Relative Bias (MARB) values of 15% and 20%, respectively. The spruce needle sample gave a measured result of 21.04 Bq/kg, with a relative bias of 23.78% against a MARB of 30%. These results confirm our analytical performance in determining 90Sr in water and spruce needle samples using the developed method.

Keywords: ALMERA proficiency test, Cherenkov counting, determination of 90Sr, environmental samples

Procedia PDF Downloads 232
31288 Green Synthesis of Magnetic, Silica Nanocomposite and Its Adsorptive Performance against Organochlorine Pesticides

Authors: Waleed A. El-Said, Dina M. Fouad, Mohamed H. Aly, Mohamed A. El-Gahami

Abstract:

Green synthesis of nanomaterials has received increasing attention as an eco-friendly technology in materials science. Here, we have used two types of extract from green tea leaves (i.e., total extract and tannin extract) as reducing agents in a rapid, simple, one-step synthesis of a mesoporous silica nanoparticle (MSNP)/iron oxide (Fe3O4) nanocomposite based on the deposition of Fe3O4 onto MSNPs. The MSNPs/Fe3O4 nanocomposite was characterized by X-ray diffraction, Fourier transform infrared spectroscopy, scanning electron microscopy, energy dispersive X-ray spectroscopy, vibrating sample magnetometry, N2 adsorption, and high-resolution transmission electron microscopy. The average mesoporous silica particle diameter was around 30 nm, with a high surface area (818 m2/g). The MSNPs/Fe3O4 nanocomposite was used for removing the pesticide lindane (an environmental hazard) from aqueous solutions. Fourier transform infrared spectroscopy, UV-vis spectroscopy, high-performance liquid chromatography, and gas chromatography were used to confirm the high ability of the MSNPs/Fe3O4 nanocomposite to sense and capture lindane molecules with high sorption capacity (more than 89%), which could form the basis of a new eco-friendly strategy for detecting and removing pesticides and a promising material for water treatment applications.

Keywords: green synthesis, mesoporous silica, magnetic iron oxide NPs, lindane adsorption

Procedia PDF Downloads 436
31287 Effect of Ethanol Concentration and Enzyme Pre-Treatment on Bioactive Compounds from Ginger Extract

Authors: S. Lekhavat, T. Kajsongkram, S. Sang-han

Abstract:

Dried ginger was extracted, and the effects of ethanol concentration and enzyme pre-treatment on its bioactive compounds in the solvent extraction process were investigated. Sliced fresh ginger was dried in an oven dryer at 70 °C for 24 hours and ground to a powder, whose size was controlled by passing through a 20-mesh sieve. In the enzyme pre-treatment process, the ginger powder was sprayed with 1% (w/w) cellulase and incubated at 45 °C for 2 hours, followed by extraction with ethanol at concentrations of 0, 20, 40, 60, and 80% (v/v). The ratio of ginger powder to ethanol was 1:9, and the extraction conditions were 80 °C for 2 hours. The bioactive compounds extracted from ginger, in both enzyme-treated and non-enzyme-treated samples, were examined: total phenolic content (TPC), 6-gingerol (6G), 6-shogaol (6S), and antioxidant activity (IC50 by DPPH assay). Regardless of enzyme treatment, 60% ethanol provided the highest TPC (20.36 GAE mg/g dried ginger), 6G (0.77%), and 6S (0.036%) and the lowest IC50 (625 μg/mL) compared with the other ethanol ratios. Considering the effect of the enzyme on the bioactive compounds and antioxidant activity, the enzyme-treated samples had more 6G (0.17-0.77%) and 6S (0.020-0.036%) than the non-enzyme-treated samples (0.13-0.77% 6G, 0.015-0.036% 6S). However, the non-enzyme-treated extracts showed higher TPC (6.76-20.36 GAE mg/g dried ginger) and lower IC50 (625-1494 μg/mL) than the enzyme-treated extracts (TPC 5.36-17.50 GAE mg/g dried ginger, IC50 793-2146 μg/mL).

Keywords: antioxidant activity, enzyme, extraction, ginger

Procedia PDF Downloads 256
31286 Web Search Engine Based Naming Procedure for Independent Topic

Authors: Takahiro Nishigaki, Takashi Onoda

Abstract:

In recent years, the amount of document data has been increasing with the spread of the Internet, and many methods have been studied for extracting topics from large document collections. We previously proposed Independent Topic Analysis (ITA), which uses Independent Component Analysis to extract topics that are independent of each other from large document data such as newspaper articles. A topic produced by ITA is represented by a set of words. However, such a word set can be quite different from the topic the user imagines. For example, the top five words with high independence for one topic are: Topic1 = {"scor", "game", "lead", "quarter", "rebound"}. This topic presumably represents "SPORTS", but that topic name has to be attached by the user; ITA cannot name topics. Therefore, in this research, we propose a method that uses a web search engine to obtain topic names that are easy for people to understand from the word sets produced by independent topic analysis. In particular, we search for the set of topical words, and the title of the top page in the search results is taken as the topic name. We also apply the proposed method to several datasets and verify its effectiveness.
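
A minimal sketch of the naming step; `search_result_titles` is a hypothetical wrapper for whatever web search engine API is available, since the abstract does not name one.

```python
def search_result_titles(query):
    """Hypothetical wrapper around a web search engine API; substitute any
    real search REST API here."""
    raise NotImplementedError

def name_topic(topic_words):
    """Name a topic by the title of the top page returned for its word set."""
    titles = search_result_titles(" ".join(topic_words))
    return titles[0] if titles else " ".join(topic_words)

# e.g., name_topic(["scor", "game", "lead", "quarter", "rebound"]) might
# return a page title from which the user-friendly name "SPORTS" is read.
```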

Keywords: independent topic analysis, topic extraction, topic naming, web search engine

Procedia PDF Downloads 119