Search results for: extraction tool
5699 Data Mining Spatial: Unsupervised Classification of Geographic Data
Authors: Chahrazed Zouaoui
Abstract:
In recent years, the volume of geospatial information has been increasing due to the evolution of information and communication technologies; this information is often presented through geographic information systems (GIS) and stored in spatial databases (SDB). Classical data mining has revealed a weakness in extracting knowledge from these enormous amounts of data, owing to the particularity of spatial entities, which are characterized by their interdependence (the first law of geography). This gave rise to spatial data mining. Spatial data mining is a process of analyzing geographic data that allows the extraction of knowledge and spatial relationships from geospatial data; among its methods, we distinguish the monothematic and the thematic. Geo-clustering, one of the main tasks of spatial data mining, belongs to the monothematic methods. It groups similar geospatial entities into the same class and assigns more dissimilar entities to different classes; in other words, it maximizes intra-class similarity and minimizes inter-class similarity while taking into account the particularity of geospatial data. Two approaches to geo-clustering exist: dynamic processing of the data, which applies algorithms designed for the direct treatment of spatial data, and an approach based on spatial data pre-processing, which applies classical clustering algorithms to pre-processed data (obtained by integrating the spatial relationships). Since the pre-processing approach is quite complex in many cases, the search for approximate solutions involves approximation algorithms; among these, we are interested in dedicated approaches (partitioning and density-based clustering methods) and the bees algorithm (a biomimetic approach). Our study proposes a design for this problem that uses different algorithms to automatically detect geospatial neighborhoods in order to implement geo-clustering by pre-processing, and applies the bees algorithm to this problem for the first time in the geospatial field.
Keywords: mining, GIS, geo-clustering, neighborhood
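To make the pre-processing approach concrete, the sketch below appends weighted, scaled coordinates to the thematic attributes before running a classical partitioning method (k-means) and a density-based method (DBSCAN). It is a minimal illustration, not the authors' implementation; the toy data, the spatial weight w_spatial, and the use of scikit-learn are all assumptions.

```python
import numpy as np
from sklearn.cluster import KMeans, DBSCAN
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
coords = rng.uniform(0, 100, size=(200, 2))   # x, y positions of geo-entities
attrs = rng.normal(size=(200, 3))             # thematic attributes

# Pre-processing: append scaled, weighted coordinates to the attributes so
# that similarity also reflects spatial neighborhood (1st law of geography).
w_spatial = 0.5                                # assumed weighting factor
features = np.hstack([
    StandardScaler().fit_transform(attrs),
    w_spatial * StandardScaler().fit_transform(coords),
])

kmeans_labels = KMeans(n_clusters=4, n_init=10).fit_predict(features)  # partitioning
dbscan_labels = DBSCAN(eps=0.8, min_samples=5).fit_predict(features)   # density-based
print(np.bincount(kmeans_labels))
```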
Procedia PDF Downloads 375
5698 Solid Phase Micro-Extraction/Gas Chromatography-Mass Spectrometry Study of Volatile Compounds from Strawberry Tree and Autumn Heather Honeys
Authors: Marinos Xagoraris, Elisavet Lazarou, Eleftherios Alissandrakis, Christos S. Pappas, Petros A. Tarantilis
Abstract:
Strawberry tree (Arbutus unedo L.) and autumn heather (Erica manipuliflora Salisb.) are important beekeeping plants of Greece. Six monofloral honeys (four strawberry tree, two autumn heather) were analyzed by means of solid phase micro-extraction (SPME, 60 min, 60 °C) followed by gas chromatography coupled to mass spectrometry (GC-MS) for the purpose of assessing their botanical origin. A divinylbenzene/carboxen/polydimethylsiloxane (DVB/CAR/PDMS) fiber was employed, and benzophenone was used as the internal standard. The volatile compounds with the highest concentrations (μg/g of honey, expressed as benzophenone) in the strawberry tree honey samples were α-isophorone (2.50-8.12); 3,4,5-trimethyl-phenol (0.20-4.62); 2-hydroxy-isophorone (0.06-0.53); 4-oxoisophorone (0.38-0.46); and β-isophorone (0.02-0.43). Regarding the heather honey samples, the most abundant compounds were 1-methoxy-4-propyl-benzene (1.22-1.40); p-anisaldehyde (0.97-1.28); p-anisic acid (0.35-0.58); 2-furaldehyde (0.52-0.57); and benzaldehyde (0.41-0.56). Norisoprenoids are potent floral markers for strawberry tree honey: β-isophorone is found exclusively in the volatile fraction of this type of honey, while α-isophorone, 4-oxoisophorone, and 2-hydroxy-isophorone could be considered additional marker compounds. The analysis of autumn heather honey revealed that phenolic compounds are the most abundant, and p-anisaldehyde, 1-methoxy-4-propyl-benzene, and p-anisic acid could serve as potent marker compounds. In conclusion, marker compounds for determining the botanical origin of these honeys could be identified, as several norisoprenoids and phenolic components were found exclusively in them, or in higher concentrations than in common Greek honey varieties.
Keywords: SPME/GC-MS, volatile compounds, heather honey, strawberry tree honey
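The concentrations above are expressed as benzophenone equivalents, which suggests semi-quantification against the internal standard. A minimal sketch of that calculation is given below; the peak areas, spike amount, and sample mass are invented for illustration and do not come from the paper.

```python
# Semi-quantification relative to an internal standard (benzophenone):
# analyte amount is expressed as internal-standard equivalents per gram.
def conc_ug_per_g(peak_area, is_peak_area, is_amount_ug, sample_mass_g):
    """Concentration expressed as internal-standard equivalents (ug/g)."""
    return (peak_area / is_peak_area) * is_amount_ug / sample_mass_g

# e.g. an alpha-isophorone peak twice the internal-standard area,
# assuming 10 ug benzophenone spiked into 2 g of honey:
print(conc_ug_per_g(peak_area=2.0e6, is_peak_area=1.0e6,
                    is_amount_ug=10.0, sample_mass_g=2.0))  # -> 10.0 ug/g
```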
Procedia PDF Downloads 200
5697 Computed Tomography Brain and Inpatient Falls: An Audit Evaluating the Indications and Outcomes
Authors: Zain Khan, Steve Ahn, Kathy Monypenny, James Fink
Abstract:
In Australian public hospitals, there were approximately 34,000 reported inpatient falls between 2015 and 2016. The gold standard for diagnosing intracranial injury is non-contrast-enhanced brain computed tomography (CTB). Over a three-month timeframe, a total of one hundred and eighty (180) falls were documented between the hours of 4 p.m. and 8 a.m. at a large metropolitan hospital. Only three (3) of the resulting CTB scans demonstrated a positive intracranial finding. The rationale for scanning varied; common indications included a fall with head strike, the presence of blood-thinning medication, loss of consciousness, reduced Glasgow Coma Scale (GCS), vomiting, and new neurological findings. There are several validated tools to aid decision-making around ordering CTB scans in the acute setting, but no such accepted tool exists for the inpatient space. With further data collection, spanning a greater length of time and involving multiple centres, work can be done towards generating such a tool for inpatient falls.
Keywords: computed tomography, falls, inpatient, intracranial hemorrhage
Procedia PDF Downloads 171
5696 The International Classification of Functioning, Disability and Health (ICF) as a Problem-Solving Tool in Disability Rehabilitation and Education Alliance in Metabolic Disorders (DREAM) at Sultan Bin Abdul Aziz Humanitarian City: A Prototype for Reh
Authors: Hamzeh Awad
Abstract:
Disability is considered to be a complex worldwide phenomenon that is rising at a phenomenal rate and is caused by many different factors. Chronic diseases such as cardiovascular disease and diabetes can lead to mobility disability in particular and disability in general. The ICF is an integrative bio-psycho-social model of functioning and disability, considered by the World Health Organization (WHO) to be the reference for disability classification; its categories and core sets are used to classify a disorder's functional limitations. Specialist programs at Sultan Bin Abdul Aziz Humanitarian City (SBAHC), which provide both inpatient and outpatient services, have started to implement the ICF and use it as a problem-solving tool in rehabilitation. Diabetes is a leading contributing factor to disability and is considered epidemic in several Gulf countries, including the Kingdom of Saudi Arabia (KSA), where its prevalence continues to increase dramatically. Metabolic disorders, mainly diabetes, are not well covered in the rehabilitation field. The purpose of this study is to present, to the research and clinical rehabilitation fields, DREAM and the ICF as a framework in clinical and research settings in rehabilitation services, and to shed light on the use of the ICF as a problem-solving tool at SBAHC. There are synergies between the causes of disability and wider public health priorities in relation to both chronic disease and disability prevention. Therefore, there is a need for strong advocacy and understanding of the role of the ICF as a reference in rehabilitation settings in the Middle East if we wish to seize the opportunity to reverse current trends of acquired disability in the region.
Keywords: international classification of functioning, disability and health (ICF), prototype, rehabilitation and diabetes
Procedia PDF Downloads 351
5695 Calculate Product Carbon Footprint through the Internet of Things from Network Science
Authors: Jing Zhang
Abstract:
Reducing mankind's carbon footprint and becoming more sustainable is one of the major challenges of our era. The Internet of Things (IoT) mainly addresses three kinds of connection: Things to Things (T2T), Human to Things (H2T), and Human to Human (H2H). Borrowing this classification from the IoT, we find that the carbon footprints of industries can also be divided in these three ways. Therefore, monitoring the routes along which products are generated and circulated may help calculate a product's carbon footprint. This paper does not consider any technique used by the IoT itself; rather, its ideas are used to look at the connections between products. A carbon footprint is like a gene or mark of a product, carried from the raw materials to the final product, which never leaves the product. The contribution of this paper is to combine the characteristics of the IoT with the methodology of network science to find a way to calculate a product's carbon footprint. Life cycle assessment (LCA), which includes three kinds, is the traditional and main tool used to calculate the carbon footprint of products.
Keywords: product carbon footprint, Internet of Things, network science, life cycle assessment
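As a rough illustration of the idea that a footprint accumulates along a product's generation and circulation routes, the sketch below sums process emissions over a toy supply network; the graph, edge values, and use of networkx are assumptions, not the paper's method.

```python
import networkx as nx

g = nx.DiGraph()
g.add_edge("ore", "steel", kg_co2=2.1)        # T2T: material processing
g.add_edge("steel", "frame", kg_co2=0.7)
g.add_edge("frame", "bicycle", kg_co2=0.4)    # assembly
g.add_edge("bicycle", "retail", kg_co2=0.3)   # circulation / transport

def product_footprint(graph, sink):
    """Total embodied emissions reaching `sink` through the supply network."""
    return sum(d["kg_co2"] for u, v, d in graph.edges(data=True)
               if v == sink or nx.has_path(graph, v, sink))

print(product_footprint(g, "retail"))  # 3.5 kg CO2 for the toy chain
```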
Procedia PDF Downloads 116
5694 A Q-Methodology Approach for the Evaluation of Land Administration Mergers
Authors: Tsitsi Nyukurayi Muparari, Walter Timo De Vries, Jaap Zevenbergen
Abstract:
The nature of land administration accommodates diversity in terms of both the spatial data handling activities and the expertise involved, which supposedly aims to satisfy the unpredictable demand for land data and the diverse demands of the customers arising from the land. However, it is known that strategic decisions on restructuring are in most cases repelled in favour of complex structures that strive to accommodate professional diversity and diverse roles in the field of land administration. Yet despite this widely accepted knowledge, there is scant theoretical knowledge concerning the psychological methodologies that can extract the deeper perceptions of diverse spatial experts in order to explain the invisible controlling arm behind the polarised reception of ideas of change. This paper evaluates Q-methodology in the context of a cadastre and land registry merger (under one agency), using the Swedish cadastral system as a case study. Precisely, the aim of this paper is to evaluate the effectiveness of Q-methodology for modelling the diverse psychological perceptions of spatial professionals involved in the widely contested decision of merging the cadastre and land registry components of land administration. The empirical approach prescribed by Q-methodology starts with concourse development, followed by the design of the statements and the Q-sort instrument, the selection of participants, the Q-sorting exercise, factor extraction with PQMethod, and finally narrative development by the logic of abduction. The paper uses 36 statements developed from a dominant competing-values theory that stands out for its reliability and validity, purposively selects 19 participants for the Q-sorting exercise, proceeds with factor extraction from the diversity using the varimax and judgemental rotations provided by PQMethod, and effects the narrative construction using the logic of abduction. The findings from the diverse perceptions of cadastral professionals on the merger of the land registry and cadastre components in Sweden's mapping agency (Lantmäteriet) show that the focus is inclined towards perfecting the relationship between legal expertise and technical spatial expertise. There is much emphasis on tradition, loyalty, and communication attributes, which concern the organisation's internal environment, rather than on innovation and market attributes, which reflect customer behaviour and the needs arising from changing humankind-land relations. It can be concluded that Q-methodology offers effective tools that pursue a psychological approach to the evaluation and gradation of strategic change decisions by extracting the local perceptions of spatial experts.
Keywords: cadastre, factor extraction, land administration merger, land registry, q-methodology, rotation
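A minimal sketch of the factor-extraction step that PQMethod performs is shown below: in Q-methodology the participants, not the statements, are the variables, so a 36-statement by 19-person matrix of Q-sorts is factored and varimax-rotated. The random sorts and the scikit-learn FactorAnalysis call stand in for the real data and for PQMethod's own routines.

```python
import numpy as np
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(1)
# 36 statements (rows/observations) x 19 participants (columns/variables),
# each cell a forced-distribution ranking from -4 to +4.
q_sorts = rng.integers(-4, 5, size=(36, 19)).astype(float)

fa = FactorAnalysis(n_components=3, rotation="varimax")
fa.fit(q_sorts)                         # factors the person-correlation structure
person_loadings = fa.components_.T      # 19 persons x 3 factors
print(person_loadings.round(2))         # who loads on which shared viewpoint
```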
Procedia PDF Downloads 194
5693 A Study on the Impacts of Computer Aided Design on the Architectural Design Process
Authors: Halleh Nejadriahi, Kamyar Arab
Abstract:
Computer-aided design (CAD) tools have been used extensively by architects for several decades, evolving from a simple drafting tool into intelligent architectural software and a powerful means of communication for architects. CAD plays an essential role in the profession of architecture and is a basic tool for any architectural firm. It is not possible for an architectural firm to compete without taking advantage of computer software, due to the high demand and competition in the architectural industry. The aim of this study is to evaluate the impacts of CAD on the architectural design process from the conceptual level to the final product, particularly in architectural practice. It examines the range of benefits of integrating CAD into the industry and discusses the possible drawbacks limiting architects. The method of this study is qualitative, based on data collected from professionals' perspectives. The identified benefits and limitations of CAD in the architectural design process will raise professionals' awareness of the potential of CAD and its proper utilization in the industry, which would result in higher productivity along with better quality in architectural offices.
Keywords: architecture, architectural practice, computer aided design (CAD), design process
Procedia PDF Downloads 360
5692 Modeling Sediment Yield Using the SWAT Model: A Case Study of Upper Ankara River Basin, Turkey
Authors: Umit Duru
Abstract:
The Soil and Water Assessment Tool (SWAT) was tested for prediction of the water balance and sediment yield in the gauged Ankara basin, Turkey. The overall objective of this study was to evaluate the performance and applicability of SWAT in this region of Turkey. Thirteen years of monthly stream flow and suspended sediment data were used for calibration and validation. This research assessed model performance based on differences between observed and predicted suspended sediment yields during the calibration (1987-1996) and validation (1982-1984) periods. Statistical comparisons of suspended sediment produced values for NSE (Nash-Sutcliffe efficiency), RE (relative error), and R² (coefficient of determination) of 0.81, -1.55, and 0.93, respectively, during the calibration period, and NSE, RE (%), and R² of 0.77, -2.61, and 0.87, respectively, during the validation period. Based on the analyses, SWAT satisfactorily simulated the observed hydrology and sediment yields and can be used as a tool in decision-making for water resources planning and management in the basin.
Keywords: calibration, GIS, sediment yield, SWAT, validation
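For reference, the goodness-of-fit statistics quoted above can be written out as simple formulas; the sketch below uses illustrative arrays rather than SWAT output, and the RE definition (mean bias relative to the observed mean, in percent) is an assumption about the convention used.

```python
import numpy as np

def nse(obs, sim):
    """Nash-Sutcliffe efficiency: 1 - sum((O-S)^2) / sum((O-mean(O))^2)."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

def relative_error(obs, sim):
    """RE (%): mean bias of simulated values relative to the observed mean."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return 100.0 * (sim.mean() - obs.mean()) / obs.mean()

def r_squared(obs, sim):
    """Coefficient of determination of the obs-sim linear correlation."""
    return np.corrcoef(obs, sim)[0, 1] ** 2

obs = [12.0, 30.5, 18.2, 44.1, 25.3]   # observed monthly sediment yield (toy)
sim = [11.1, 28.9, 20.0, 41.7, 26.8]   # simulated (toy)
print(nse(obs, sim), relative_error(obs, sim), r_squared(obs, sim))
```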
Procedia PDF Downloads 281
5691 Robustness of the Deep Chroma Extractor and Locally-Normalized Quarter Tone Filters in Automatic Chord Estimation under Reverberant Conditions
Authors: Luis Alvarado, Victor Poblete, Isaac Gonzalez, Yetzabeth Gonzalez
Abstract:
In MIREX 2016 (http://www.music-ir.org/mirex), the deep neural network (DNN)-based Deep Chroma Extractor, proposed by Korzeniowski and Widmer, reached the highest score in an audio chord recognition task. In the present paper, this tool is assessed under reverberant acoustic environments and distinct source-microphone distances. The evaluation dataset comprises The Beatles and Queen datasets. These datasets are sequentially re-recorded with a single microphone in a real reverberant chamber at four reverberation times (approximately 0 [anechoic], 1, 2, and 3 s), as well as four source-microphone distances (32, 64, 128, and 256 cm). It is expected that the performance of the trained DNN will decrease dramatically under these acoustic conditions, with signals degraded by room reverberation and distance to the source. Recently, the effect of the bio-inspired Locally-Normalized Cepstral Coefficients (LNCC) has been assessed in a text-independent speaker verification task using speech signals degraded by additive noise at different signal-to-noise ratios with variations of recording distance, and it has also been assessed under reverberant conditions with variations of recording distance. LNCC showed performance as high as that of the state-of-the-art Mel Frequency Cepstral Coefficient filters. Based on these results, this paper proposes a variation of locally-normalized triangular filters called Locally-Normalized Quarter Tone (LNQT) filters. By using the LNQT spectrogram, robustness improvements of the trained Deep Chroma Extractor are expected compared with classical triangular filters, thereby compensating for the degradation of the music signal and improving the accuracy of the chord recognition system.
Keywords: chord recognition, deep neural networks, feature extraction, music information retrieval
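A minimal sketch of what a quarter-tone triangular filterbank with local normalization might look like is given below; the reference frequency, number of filters, and neighbourhood width are assumptions, not the authors' LNQT specification.

```python
import numpy as np

def quarter_tone_filterbank(n_fft=2048, sr=22050, f_min=65.4, n_filters=96):
    """Triangular filters with centres spaced a quarter tone (2^(1/24)) apart."""
    freqs = np.fft.rfftfreq(n_fft, 1.0 / sr)
    centers = f_min * 2.0 ** (np.arange(n_filters + 2) / 24.0)
    fb = np.zeros((n_filters, freqs.size))
    for i in range(n_filters):
        lo, c, hi = centers[i], centers[i + 1], centers[i + 2]
        fb[i] = np.clip(np.minimum((freqs - lo) / (c - lo),
                                   (hi - freqs) / (hi - c)), 0.0, None)
    return fb

def locally_normalize(band_energies, half_width=4, eps=1e-8):
    """Divide each band by the mean energy of its spectral neighbourhood."""
    out = np.empty_like(band_energies)
    n = band_energies.shape[0]
    for i in range(n):
        lo, hi = max(0, i - half_width), min(n, i + half_width + 1)
        out[i] = band_energies[i] / (band_energies[lo:hi].mean(axis=0) + eps)
    return out

spectrum = np.abs(np.fft.rfft(np.random.default_rng(5).standard_normal(2048))) ** 2
lnqt = locally_normalize(quarter_tone_filterbank() @ spectrum[:, None])
print(lnqt.shape)  # (96, 1): one locally-normalized quarter-tone band per row
```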
Procedia PDF Downloads 232
5690 Agricultural Extension Education for Female: A Tool for Sustainable Rural Development in Pakistan
Authors: Jahanzaib
Abstract:
The rural economy can be uplifted through agricultural extension education for women, as the majority are uneducated. The present study was carried out in five districts (Bahawalpur, Lodhran, Raheem Yar Khan, Bahawalnagar, and Vehari) of southern Punjab, Pakistan. Ten women of poor economic background were selected from each district for agricultural training. The training was provided free of cost through the Punjab Skills Development Program. After six months of training, the trainees were awarded certificates and a tool kit. After completion of the training, data were recorded and analyzed; the results indicate that the trained women were in a better economic position than women of nearby districts without training. From this study, we can conclude that agricultural education for women can not only improve the economy of individual families but also improve the agriculture of Pakistan on a sustainable basis, as the majority of workers in rural areas of Pakistan are female.
Keywords: agricultural extension education, sustainable rural development, agriculture, rural development in Pakistan
Procedia PDF Downloads 238
5689 Assessing Acute Toxicity and Endocrine Disruption Potential of Selected Packages Internal Layers Extracts
Authors: N. Szczepanska, B. Kudlak, G. Yotova, S. Tsakovski, J. Namiesnik
Abstract:
In the scientific literature on packaging materials designed to be in contact with food (food contact materials), there is much information on the raw materials used for their production, as well as on their physiochemical properties, types, and parameters. However, not much attention is given to the migration of toxic substances from packaging and its actual influence on the health of the final consumer, even though health protection and food safety are priority tasks. The goal of this study was to estimate the impact of the particular foodstuff packaging type and the food production and storage conditions on the degree of leaching of potentially toxic compounds and endocrine disruptors into foodstuffs, using the Microtox acute toxicity test and the XenoScreen YES/YAS assay. The selected foodstuff packaging materials were metal cans used for fish storage and Tetra Pak. Five simulants suited to specific kinds of food were chosen in order to assess global migration: distilled water for aqueous foods with a pH above 4.5; 3% acetic acid in distilled water for acidic aqueous foods with a pH below 4.5; 5% ethanol for any food that may contain alcohol; and dimethyl sulfoxide (DMSO) and artificial saliva, in view of the possibility of using them as simulation media. For each packaging material, a factorial design over the three independent variables (simulant, temperature, and contact time) was performed. Migration of xenobiotics from the epoxy resins was studied at three different temperatures (25°C, 65°C, and 121°C) and extraction times of 12 h, 48 h, and 2 weeks. This experimental design leads to 9 experiments for each food simulant, as the conditions of each experiment are obtained by combining the temperature and contact-time levels. Each experiment was run in triplicate for acute toxicity and in duplicate for the determination of endocrine disruption potential. Multi-factor analysis of variance (MANOVA) was used to evaluate the effects of the three main factors (solvent, temperature [temperature regime for the cup], and contact time) and their interactions on the respective dependent variable (acute toxicity or endocrine disruption potential). Of all the simulants studied, the most toxic were the acetic acid extracts of the can and Tetra Pak linings, an indication of significant migration of toxic compounds. This migration increased with contact time and temperature, supporting the hypothesis that food products with low pH values significantly damage the internal resin lining. Can-lining extracts in all simulation media, excluding distilled water and artificial saliva, proved to contain androgen agonists even at 25°C and an extraction time of 12 h. For the Tetra Pak extracts, significant endocrine potential was detected for acetic acid, DMSO, and saliva.
Keywords: food packaging, extraction, migration, toxicity, biotest
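A minimal sketch of this multi-factor analysis of variance is shown below for one dependent variable; the statsmodels formulation and the randomly generated Microtox responses are illustrative assumptions, not the study's data.

```python
from itertools import product

import numpy as np
import pandas as pd
import statsmodels.api as sm
from statsmodels.formula.api import ols

rng = np.random.default_rng(2)
# full factorial: 5 simulants x 3 temperatures x 3 contact times, triplicate
rows = [(s, t, c, rep) for s, t, c in product(
            ["water", "acetic", "ethanol", "dmso", "saliva"],
            [25, 65, 121], [12, 48, 336])          # 2 weeks ~ 336 h
        for rep in range(3)]
df = pd.DataFrame(rows, columns=["simulant", "temp", "time", "rep"])
df["toxicity"] = rng.normal(50, 10, len(df))       # placeholder Microtox response

model = ols("toxicity ~ C(simulant) * C(temp) * C(time)", data=df).fit()
print(sm.stats.anova_lm(model, typ=2))             # main effects + interactions
```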
Procedia PDF Downloads 181
5688 Dengue Death Review: A Tool to Adjudge the Cause of Dengue Mortality and Use of the Tool for Prevention of Dengue Deaths
Authors: Gagandeep Singh Grover, Vini Mahajan, Bhagmal, Priti Thaware, Jaspreet Takkar
Abstract:
Dengue is a mosquito-borne viral disease endemic in many countries in the tropics and sub-tropics. The state of Punjab in India shows cyclical and seasonal variation in dengue cases, and the case fatality rate of dengue has ranged from 0.6 to 1.0 in past years. The department has initiated a review of deaths due to dengue in order to establish the exact cause of death in each case of dengue. The study was undertaken to identify the associated co-morbidities and other factors causing death in cases of dengue. The study used a predesigned proforma on which the records (medical and laboratory) were documented and then reviewed by an expert committee of doctors. The study revealed that dengue cases with co-morbidities have a longer stay in the hospital. Fluid overload and co-morbidities were found to be major factors leading to death; in confirmed cases of dengue, however, hepatorenal shutdown was found to be a major cause of mortality. The data obtained will help in sensitizing treating physicians in order to decrease dengue mortality in the future.
Keywords: dengue, death, morbidities, DHF, DSS
Procedia PDF Downloads 311
5687 Active Learning in Engineering Courses Using Excel Spreadsheet
Authors: Promothes Saha
Abstract:
Recently, transportation engineering industry members at the study university expressed concern that students lacked the skills needed to solve real-world engineering problems using spreadsheet data analysis. In response to the concerns shown by industry members, this study investigated how to engage students better by incorporating spreadsheet analysis during class and, at the same time, help them learn the course topics. Helping students link theoretical knowledge to real-world problems can be a challenge. In this effort, in-class activities and worksheets were redesigned to integrate Excel for solving example problems using built-in tools, including cell referencing, equations, the Data Analysis ToolPak, the Solver tool, conditional formatting, charts, etc. The effectiveness of this technique was investigated using students' evaluations of the course, enrollment data, and students' comments. Based on the data for those criteria, it is evident that the spreadsheet activities may increase student learning.
Keywords: civil, engineering, active learning, transportation
Procedia PDF Downloads 138
5686 Lung Cancer Detection and Multi Level Classification Using Discrete Wavelet Transform Approach
Authors: V. Veeraprathap, G. S. Harish, G. Narendra Kumar
Abstract:
Uncontrolled growth of abnormal cells in the lung, in the form of a tumor, can be either benign (non-cancerous) or malignant (cancerous). Patients with lung cancer (LC) have an average life expectancy of five years; early diagnosis, detection, and prediction widen the treatment options beyond the risks of invasive surgery and increase the survival rate. Computed tomography (CT), positron emission tomography (PET), and magnetic resonance imaging (MRI) are common modalities for earlier detection of cancer. A Gaussian filter along with a median filter is used for smoothing and noise removal, and histogram equalization (HE) for image enhancement gives the best results without inviting further opinions. The lung cavities are extracted, the background other than the two lung cavities is completely removed, and the right and left lungs are segmented separately. Region-property measurements (area, perimeter, diameter, centroid, and eccentricity) are taken for the segmented tumor image, while texture is characterized by gray-level co-occurrence matrix (GLCM) functions; feature extraction provides the region of interest (ROI) given as input to the classifier. Two levels of classification are employed: k-nearest neighbor (KNN) is used for determining the patient's condition as normal or abnormal, while an artificial neural network (ANN) is used for identifying the cancer stage. The discrete wavelet transform (DWT) algorithm is used for the main feature extraction, leading to the best efficiency. The developed technique shows encouraging results for real-time information and online detection in future research.
Keywords: artificial neural networks, ANN, discrete wavelet transform, DWT, gray-level co-occurrence matrix, GLCM, k-nearest neighbor, KNN, region of interest, ROI
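As an illustration of the texture stage, the sketch below computes GLCM statistics from candidate regions and feeds them to a KNN normal/abnormal classifier; the synthetic images, labels, and chosen GLCM properties are assumptions, and the DWT and ANN stages are omitted.

```python
import numpy as np
from skimage.feature import graycomatrix, graycoprops
from sklearn.neighbors import KNeighborsClassifier

def glcm_features(roi_u8):
    """Texture descriptors from a gray-level co-occurrence matrix."""
    glcm = graycomatrix(roi_u8, distances=[1], angles=[0, np.pi / 2],
                        levels=256, symmetric=True, normed=True)
    return np.hstack([graycoprops(glcm, p).ravel()
                      for p in ("contrast", "homogeneity",
                                "energy", "correlation")])

rng = np.random.default_rng(3)
rois = rng.integers(0, 256, size=(40, 64, 64), dtype=np.uint8)  # stand-in ROIs
labels = rng.integers(0, 2, size=40)            # 0 = normal, 1 = abnormal
X = np.array([glcm_features(r) for r in rois])

knn = KNeighborsClassifier(n_neighbors=3).fit(X[:30], labels[:30])
print(knn.predict(X[30:]))                      # first-level decision
```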
Procedia PDF Downloads 153
5685 Web Data Scraping Technology Using Term Frequency Inverse Document Frequency to Enhance the Big Data Quality on Sentiment Analysis
Authors: Sangita Pokhrel, Nalinda Somasiri, Rebecca Jeyavadhanam, Swathi Ganesan
Abstract:
Tourism is a booming industry with huge future potential for global wealth and employment. Countless data are generated over social media sites every day, creating numerous opportunities to bring more insights to decision-makers. The integration of big data technology into the tourism industry will allow companies to determine where their customers have been and what they like. This information can then be used by businesses, such as those in charge of managing visitor centers or hotels, and tourists can get a clear idea of places before visiting. On the technical side, natural language is processed by analysing the sentiment features of online reviews from tourists, and we supply an enhanced long short-term memory (LSTM) framework for sentiment feature extraction from travel reviews. We constructed a web review database using a crawler and web scraping techniques for experimental validation to evaluate the effectiveness of our methodology. The sentences in text form were first classified through the VADER and RoBERTa models to obtain the polarity of the reviews. In this paper, we study feature extraction methods, such as count vectorization and TF-IDF vectorization, and implement a convolutional neural network (CNN) classifier algorithm for sentiment analysis to decide whether the tourist's attitude towards a destination is positive, negative, or simply neutral, based on the review text posted online. The results demonstrate that with the CNN algorithm, after pre-processing and cleaning the dataset, we obtained an accuracy of 96.12% for positive and negative sentiment analysis.
Keywords: counter vectorization, convolutional neural network, crawler, data technology, long short-term memory, web scraping, sentiment analysis
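The two vectorization schemes studied can be illustrated in a few lines; the toy reviews below stand in for the scraped corpus, and the scikit-learn classes are one common way (an assumption here) to implement count and TF-IDF vectorization.

```python
from sklearn.feature_extraction.text import CountVectorizer, TfidfVectorizer

reviews = ["the beach was beautiful and the staff friendly",
           "crowded, noisy and the hotel was overpriced",
           "friendly staff, beautiful views, would visit again"]

counts = CountVectorizer().fit_transform(reviews)   # raw term counts
tfidf = TfidfVectorizer().fit_transform(reviews)    # counts down-weighted by
                                                    # document frequency
print(counts.shape, tfidf.shape)  # same vocabulary, different weighting
```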
Procedia PDF Downloads 88
5684 Calibration and Validation of the Aquacrop Model for Simulating Growth and Yield of Rain-fed Sesame (Sesamum indicum L.) Under Different Soil Fertility Levels in the Semi-arid Areas of Tigray
Authors: Abadi Berhane, Walelign Worku, Berhanu Abrha, Gebre Hadgu
Abstract:
Sesame is an important oilseed crop in Ethiopia, the second most exported agricultural commodity next to coffee. However, soil fertility management is poor, and a research-led farming system for the crop is lacking. The AquaCrop model was applied as a decision-support tool; it performs a semi-quantitative approach to simulate the yield of crops under different soil fertility levels. The objective of this experiment was to calibrate and validate the AquaCrop model for simulating the growth and yield of sesame under different nitrogen fertilizer levels and to test the performance of the model as a decision-support tool for improved sesame cultivation in the study area. The experiment was laid out as a randomized complete block design (RCBD) in a factorial arrangement in the 2016, 2017, and 2018 main cropping seasons, with four nitrogen fertilizer rates (0, 23, 46, and 69 kg/ha nitrogen) and three improved varieties (Setit-1, Setit-2, and Humera-1). In the meantime, growth, yield, and yield components of sesame were collected from each treatment. The coefficient of determination (R²), root mean square error (RMSE), normalized root mean square error (N-RMSE), model efficiency (E), and degree of agreement (D) were used to test the performance of the model. The results indicated that the AquaCrop model successfully simulated soil water content, with R² varying from 0.92 to 0.98, RMSE from 6.5 to 13.9 mm, E from 0.78 to 0.94, and D from 0.95 to 0.99; the corresponding values for aboveground biomass (AB) varied from 0.92 to 0.98, 0.33 to 0.54 tons/ha, 0.74 to 0.93, and 0.9 to 0.98, respectively. The results on the canopy cover of sesame also showed that the model acceptably simulated canopy cover, with R² varying from 0.95 to 0.99 and an RMSE of 5.3 to 8.6%. The AquaCrop model was appropriately calibrated to simulate soil water content, canopy cover, aboveground biomass, and sesame yield, and the results indicated that the model adequately simulated the growth and yield of sesame under the different nitrogen fertilizer levels. The AquaCrop model might therefore be an important tool for improved soil fertility management and yield enhancement strategies for sesame, and it might be applied as a decision-support tool in soil fertility management in sesame production.
Keywords: aquacrop model, sesame, normalized water productivity, nitrogen fertilizer
Procedia PDF Downloads 75
5683 Early Requirement Engineering for Design of Learner Centric Dynamic LMS
Authors: Kausik Halder, Nabendu Chaki, Ranjan Dasgupta
Abstract:
We present a modelling framework that supports the engineering of early requirements specifications for the design of a learner-centric dynamic learning management system. The framework is based on the i* modelling tool and means-end analysis and adopts primitive concepts for modelling early requirements (such as actor, goal, and strategic dependency). We show how pedagogical and computational requirements for designing a learner-centric learning management system can be adapted for automatic early-requirement engineering specifications. Finally, we present a model of a learner-quanta-based adaptive courseware. Our early-requirement analysis shows how means-end analysis reveals gaps and inconsistencies in early requirements specifications that are by no means trivial to discover without the help of a formal analysis tool.
Keywords: adaptive courseware, early requirement engineering, means end analysis, organizational modelling, requirement modelling
Procedia PDF Downloads 500
5682 Multi-source Question Answering Framework Using Transformers for Attribute Extraction
Authors: Prashanth Pillai, Purnaprajna Mangsuli
Abstract:
Oil exploration and production companies invest considerable time and effort to extract essential well attributes (such as well status, surface and target coordinates, wellbore depths, event timelines, etc.) from unstructured data sources like technical reports, which are often non-standardized, multimodal, and highly domain-specific by nature. It is also important to consider the context when extracting attribute values from reports that contain information on multiple wells or wellbores. Moreover, semantically similar information may be depicted in different data syntax representations across multiple pages and document sources. We propose a hierarchical multi-source fact extraction workflow based on a deep learning framework to extract essential well attributes at scale. An information retrieval module based on the transformer architecture was used to rank relevant pages in a document source utilizing the page image embeddings and semantic text embeddings. A question answering framework utilizing the LayoutLM transformer was used to extract attribute-value pairs, incorporating the text semantics and layout information from the top relevant pages in a document. To better handle context while dealing with multi-well reports, we incorporate a dynamic query generation module to resolve ambiguities. The attribute information extracted from various pages and documents is standardized to a common representation using a parser module to facilitate information comparison and aggregation. Finally, we use a probabilistic approach to fuse the information extracted from multiple sources into a coherent well record. The applicability of the proposed approach and its performance were studied on several real-life well technical reports.
Keywords: natural language processing, deep learning, transformers, information retrieval
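A minimal sketch of the page-ranking step is given below: embed the query and candidate pages, then rank pages by cosine similarity before question answering is run on the top hits. The sentence-transformers model and the plain-text pages are assumptions; the paper itself pairs page-image embeddings with text embeddings and uses a LayoutLM-based QA model.

```python
import numpy as np
from sentence_transformers import SentenceTransformer

model = SentenceTransformer("all-MiniLM-L6-v2")   # assumed embedding model
pages = ["Well A-12 spudded on 2014-03-02, total depth 3450 m ...",
         "Mud log summary for wellbore A-12/ST1 ...",
         "HSE incident report, no well attributes ..."]
query = "What is the total depth of well A-12?"

emb = model.encode(pages + [query], normalize_embeddings=True)
scores = emb[:-1] @ emb[-1]                        # cosine similarity per page
ranked = np.argsort(scores)[::-1]                  # most relevant pages first
print([pages[i][:30] for i in ranked])             # feed top pages to the QA step
```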
Procedia PDF Downloads 193
5681 Estimation of the Curve Number and Runoff Height Using the Arc CN-Runoff Tool in Sartang Ramon Watershed in Iran
Authors: L. Jowkar, M. Samiee
Abstract:
Models or systems based on rainfall and runoff are numerous and have been formulated and applied depending on the precipitation regime, temperature, and climate. In this study, the ArcCN-Runoff rainfall-runoff modeling tool was used to estimate the spatial variability of the rainfall-runoff relationship in Sartang Ramon, in the Jiroft watershed. Runoff was estimated from 6-hour rainfall. Based on the hydrological soil group map, soils of hydrological groups A, B, C, and D cover 1, 2, 55, and 41% of the basin, respectively. Given that the majority of the area has a slope above 60 percent, and considering the soil hydrologic groups, one can conclude that the Sartang Ramon Basin has a relatively high potential for producing runoff. The average runoff height for a 6-hour rainfall with a 2-year return period is 26.6 mm. The volume of runoff for the 2-year return period, calculated as the runoff height of each polygon multiplied by the area of the polygon, is 137,913,486 m³ for the whole basin.
Keywords: Arc CN-Run off, rain-runoff, return period, watershed
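The ArcCN-Runoff tool is built on the standard SCS curve-number relationship, which can be written out directly (in SI units): S = 25400/CN - 254 mm, Ia = 0.2*S, and Q = (P - Ia)² / (P - Ia + S) for P > Ia. The storm depth and CN values below are illustrative, chosen to bracket the mostly C/D-group soils reported above.

```python
def scs_runoff_mm(p_mm, cn):
    """SCS curve-number runoff depth (mm) for rainfall depth p_mm."""
    s = 25400.0 / cn - 254.0          # potential maximum retention (mm)
    ia = 0.2 * s                      # initial abstraction
    if p_mm <= ia:
        return 0.0                    # all rainfall abstracted, no runoff
    return (p_mm - ia) ** 2 / (p_mm - ia + s)

# an assumed 6-hour, 60 mm storm over high-runoff-potential soils:
for cn in (75, 85):
    print(cn, round(scs_runoff_mm(60.0, cn), 1), "mm")
# runoff volume per polygon = runoff height x polygon area, as in the abstract
```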
Procedia PDF Downloads 127
5680 Mechanical Properties of D2 Tool Steel Cryogenically Treated Using Controllable Cooling
Authors: A. Rabin, G. Mazor, I. Ladizhenski, R. Shneck, Z.
Abstract:
The hardness and hardenability of AISI D2 cold-work tool steel subjected to conventional quenching (CQ), deep cryogenic quenching (DCQ), and rapid deep cryogenic quenching heat treatments, the last enabled by a temporary porous coating based on magnesium sulfate, were investigated. Each of the cooling processes was examined from the perspective of overall process efficiency and the heat flux in the austenite-martensite transformation range, followed by characterization of the temporary porous magnesium sulfate layer using confocal laser scanning microscopy (CLSM) and of the surface and core hardness and hardenability using the Vickers hardness technique. The results show that the cooling rate (CR) in the austenite-martensite transformation range has a strong influence on the hardness of the studied steel.
Keywords: AISI D2, controllable cooling, magnesium sulfate coating, rapid cryogenic heat treatment, temporary porous layer
Procedia PDF Downloads 137
5679 Efficient Video Compression Technique Using Convolutional Neural Networks and Generative Adversarial Network
Authors: P. Karthick, K. Mahesh
Abstract:
Video has become an increasingly significant component of our everyday digital communication. With the advancement of richer content and higher display resolutions, its sheer volume poses serious obstacles to receiving, distributing, compressing, and displaying video content of high quality. In this paper, we propose an end-to-end deep video compression model that jointly optimizes all video compression components. The video compression method involves splitting the video into frames; comparing the images using convolutional neural networks (CNN) to remove duplicates; repeating a single image in place of the duplicate images by recognizing and detecting minute changes using a generative adversarial network (GAN); and recording the result with long short-term memory (LSTM). Instead of the complete image, only the small changes generated by the GAN are substituted, which helps with frame-level compression. Pixel-wise comparison is performed using k-nearest neighbours (KNN) over the frames, clustered with k-means, and singular value decomposition (SVD) is applied to each and every frame of the video for all three color channels [red, green, blue] to decrease the dimension of the utility matrix [R, G, B] by extracting its latent factors. Video frames are packed with parameters with the aid of a codec and converted to video format, and the results are compared with the original video. Repeated experiments on several videos with different sizes, durations, frames per second (FPS), and quality levels demonstrate a significant resampling rate. On average, the result produced had approximately a 10% deviation in quality and more than a 50% reduction in size when compared with the original video.
Keywords: video compression, K-means clustering, convolutional neural network, generative adversarial network, singular value decomposition, pixel visualization, stochastic gradient descent, frame per second extraction, RGB channel extraction, self-detection and deciding system
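The per-channel SVD step can be sketched directly: keep only the k largest singular values of each R, G, B matrix. The frame below is random and the rank k is an arbitrary choice for illustration, not the paper's setting.

```python
import numpy as np

def compress_channel(ch, k):
    """Truncated SVD: keep the k largest singular values (latent factors)."""
    u, s, vt = np.linalg.svd(ch.astype(float), full_matrices=False)
    return u[:, :k], s[:k], vt[:k]

def restore_channel(u, s, vt):
    """Reassemble the low-rank approximation as an 8-bit image channel."""
    return np.clip(u @ np.diag(s) @ vt, 0, 255).astype(np.uint8)

rng = np.random.default_rng(4)
frame = rng.integers(0, 256, size=(480, 640, 3), dtype=np.uint8)  # toy frame
k = 40
restored = np.dstack([restore_channel(*compress_channel(frame[..., c], k))
                      for c in range(3)])
stored = 3 * k * (480 + 640 + 1)               # values kept per frame
print(stored, "values instead of", frame.size)
```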
Procedia PDF Downloads 187
5678 Prevalence and Risk Factors of Low Back Disorder among Waste Collection Workers: A Systematic Review
Authors: Benedicta Asante, Catherine Trask, Brenna Bath
Abstract:
Background: Waste collection workers' (WCWs') activities contribute greatly to the recycling sector and are an important component of the waste management industry. As the recycling sector evolves, reports of injuries are increasing, particularly for common and debilitating musculoskeletal disorders such as low back disorder (LBD). WCWs are likely exposed to diverse work-related hazards that could contribute to LBD. However, there is currently no summary of the state of knowledge on the prevalence and risk factors of LBD within this workforce. Method: A comprehensive search was conducted in Ovid Medline, EMBASE, and Global Health e-publications with the search term categories 'low back disorder' and 'waste collection workers'. Two reviewers screened articles at the title, abstract, and full-text stages. Data were extracted on study design, sampling strategy, socio-demographics, geographical region, exposure definition, definition of LBD, response rate, statistical techniques, and LBD prevalence and risk factors. The risk of bias was assessed with a standardized tool. Results: The search of the three databases generated 79 studies. Thirty-two studies met the inclusion criteria at both the title and abstract stages; only thirteen full-text articles met the study criteria and underwent data extraction. The majority of articles reported a 12-month prevalence of LBD between 16% and 74%. Although none of the included studies quantified relationships between risk factors and LBD, the suggested risk factors for LBD among WCWs included awkward posture, lifting, pulling, pushing, repetitive motions, work duration, and physical loads. Conclusion: LBD is a major occupational health issue among WCWs. In light of these risks and future growth in this industry, further research should focus on the investigation of risk factors, with more attention to ergonomic exposure assessment and LBD prevention efforts.
Keywords: low back pain, scavenger, waste pickers, waste collection workers
Procedia PDF Downloads 253
5677 The Acceptable Roles of Artificial Intelligence in the Judicial Reasoning Process
Authors: Sonia Anand Knowlton
Abstract:
There are some cases where we as a society feel deeply uncomfortable with the use of Artificial Intelligence (AI) tools in the judicial decision-making process, and justifiably so. A perfect example is COMPAS, an algorithmic model that predicts recidivism rates of offenders to assist in the determination of their bail conditions. COMPAS turned out to be extremely racist: it massively overpredicted recidivism rates of Black offenders and underpredicted recidivism rates of white offenders. At the same time, there are certain uses of AI in the judicial decision-making process that many would feel more comfortable with and even support. Take, for example, a “super-breathalyzer,” an (albeit imaginary) tool that uses AI to deliver highly detailed information about the subject of the breathalyzer test to the legal decision-makers analyzing their drunk-driving case. This article evaluates the point at which a judge’s use of AI tools begins to undermine the public’s trust in the administration of justice. It argues that the answer to this question depends on whether the AI tool is in a role in which it must perform a moral evaluation of a human being.
Keywords: artificial intelligence, judicial reasoning, morality, technology, algorithm
Procedia PDF Downloads 81
5676 Software User Experience Enhancement through Collaborative Design
Authors: Shan Wang, Fahad Alhathal, Daniel Hobson
Abstract:
User-centered design skills play an important role in crafting a positive and intuitive user experience for software applications. Embracing a user-centric design approach involves understanding the needs, preferences, and behaviors of the end-users throughout the design process. This mindset not only enhances the usability of the software but also fosters a deeper connection between the digital product and its users. This paper covers a 6-month knowledge exchange collaboration project in 2023 between an academic institution and an industry partner that aimed to improve the user experience of a digital platform used as a knowledge management tool: to understand users' preferences for features, identify sources of frustration, and pinpoint areas for enhancement. This research adopted one of the most effective methods for implementing user-centered design, co-design workshops for testing user onboarding experiences, which involve the active participation of users in the design process. More specifically, in January 2023, we organized eight workshops with a diverse group of 11 individuals. Throughout these sessions, we accumulated a total of 11 hours of qualitative data in both video and audio formats. Subsequently, we conducted an analysis of user journeys, identifying common issues and potential areas for improvement. This analysis was pivotal in guiding the knowledge management software company in prioritizing feature enhancements and design improvements. Employing a user-centered design thinking process, we developed a series of graphic design solutions in collaboration with the software management tool company. These solutions targeted the refinement of onboarding user experiences, workplace interfaces, and interaction design, and some of them were translated into tangible interfaces for the knowledge management tool. By actively involving users in the design process and valuing their input, developers can create products that are not only functional but also resonate with the end-users, ultimately leading to greater success in the competitive software landscape. In conclusion, this paper not only contributes insights into designing onboarding user experiences for software within a co-design approach but also presents key theories on leveraging the user-centered design process in software design to enhance overall user experiences.
Keywords: user experiences, co-design, design process, knowledge management tool, user-centered design
Procedia PDF Downloads 68
5675 Creation of a Clinical Tool for Diagnosis and Treatment of Skin Disease in HIV Positive Patients in Malawi
Authors: Alice Huffman, Joseph Hartland, Sam Gibbs
Abstract:
Dermatology is often a neglected specialty in low-resource settings, despite the high morbidity associated with skin disease. This becomes even more significant with HIV infection, as dermatological conditions are more common and more aggressive in HIV-positive patients. African countries have the highest HIV infection rates, and skin conditions are frequently misdiagnosed and mismanaged because of a lack of dermatological training and educational material. The frequent lack of diagnostic tests in the African setting makes basic clinical skills all the more vital. This project aimed to improve the diagnosis and treatment of skin disease in the HIV-positive population at a district hospital in Malawi. A basic dermatological clinical tool was developed and produced in collaboration with local staff, based on the available literature and data collected from clinics, with the aim of improving diagnostic accuracy and providing guidance for the treatment of skin disease in HIV-positive patients. A literature search within Embase, Medline, and Google Scholar was performed and supplemented with data obtained by attending five antiretroviral clinics. From the literature, conditions were selected for inclusion in the resource if they were described as specific to, more prevalent in, or more extensive in the HIV-positive population, or as having more adverse outcomes when they develop in HIV-positive patients. Resource-appropriate treatment options were decided using Malawian Ministry of Health guidelines and textbooks specific to African dermatology. After the collection of data and discussion with local clinical and pharmacy staff, a list of 15 skin conditions was included, and a booklet was created using the simple layout of a picture, a diagnostic description of the disease, and treatment options. Clinical photographs were collected from local clinics (with the full consent of the patients) or from the book 'Common Skin Diseases in Africa' (permission granted if fully acknowledged and used in a not-for-profit capacity). The tool was evaluated by the local staff, alongside an educational teaching session on skin disease. This project aimed to reduce uncertainty in diagnosis and provide guidance for appropriate treatment in HIV-positive patients by gathering information into one practical and manageable resource. To further this project, we hope to review the effectiveness of the tool in practice.
Keywords: dermatology, HIV, Malawi, skin disease
Procedia PDF Downloads 203
5674 ARABEX: Automated Dotted Arabic Expiration Date Extraction using Optimized Convolutional Autoencoder and Custom Convolutional Recurrent Neural Network
Authors: Hozaifa Zaki, Ghada Soliman
Abstract:
In this paper, we introduce an approach for automated dotted Arabic expiration date extraction using an optimized convolutional autoencoder (ARABEX) with bidirectional LSTM. This approach translates Arabic dot-matrix expiration dates into their corresponding filled-in dates. A custom lightweight convolutional recurrent neural network (CRNN) model is then employed to extract the expiration dates. Because no dataset of Arabic dot-matrix expiration date images was available, we generated synthetic images by creating an Arabic dot-matrix TrueType font (TTF) to address this limitation. Our model was trained on a realistic synthetic dataset of 3,287 images covering the period from 2019 to 2027, represented in the yyyy/mm/dd format. We then trained our custom CRNN model on the generated synthetic images to assess the performance of our approach (ARABEX) by extracting expiration dates from the translated images. Our proposed approach achieved an accuracy of 99.4% on a test dataset of 658 images, while also achieving a structural similarity index (SSIM) of 0.46 for image translation on our dataset. The ARABEX approach demonstrates its ability to be applied to various downstream learning tasks, including image translation and reconstruction. Moreover, this pipeline (ARABEX+CRNN) can be seamlessly integrated into automated sorting systems to extract expiry dates and sort products accordingly during the manufacturing stage. By eliminating the need for manual entry of expiration dates, which can be time-consuming and inefficient for merchants, our approach offers significant gains in efficiency and accuracy for Arabic dot-matrix expiration date recognition.
Keywords: computer vision, deep learning, image processing, character recognition
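A minimal sketch of a lightweight CRNN of the kind described (convolutional features over the date strip, a bidirectional LSTM, and a per-timestep classifier, trained with CTC loss in practice) is given below in PyTorch; all layer sizes and the class inventory are assumptions, not the paper's architecture.

```python
import torch
import torch.nn as nn

class TinyCRNN(nn.Module):
    def __init__(self, n_classes=12):          # 10 digits + '/' + CTC blank
        super().__init__()
        self.cnn = nn.Sequential(               # feature extraction
            nn.Conv2d(1, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.rnn = nn.LSTM(64 * 8, 128, bidirectional=True, batch_first=True)
        self.fc = nn.Linear(256, n_classes)     # per-timestep character logits

    def forward(self, x):                       # x: (batch, 1, 32, W)
        f = self.cnn(x)                         # (batch, 64, 8, W/4)
        f = f.permute(0, 3, 1, 2).flatten(2)    # (batch, W/4, 64*8): a sequence
        out, _ = self.rnn(f)                    # bidirectional context
        return self.fc(out)

logits = TinyCRNN()(torch.randn(2, 1, 32, 128))
print(logits.shape)                             # torch.Size([2, 32, 12])
```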
Procedia PDF Downloads 82
5673 A Method to Identify Areas for Hydraulic Fracturing by Using Production Logging Tools
Authors: Armin Shirbazo, Hamed Lamei Ramandi, Mohammad Vahab, Jalal Fahimpour
Abstract:
Hydraulic fracturing, especially multi-stage hydraulic fracturing, is a practical solution for wells with uneconomic production, and its wide range of applications must be appraised appropriately to obtain stable well production. The production logging tool, known as the PLT in the oil and gas industry, is counted as one of the most reliable methods for evaluating the efficiency of fracturing jobs. This tool has a number of benefits and can be used to prevent subsequent production failure. It also distinguishes the different problems that occur during well production. In this study, the effectiveness of hydraulic fracturing jobs is examined using the PLT in various cases and situations. The performance of hydraulically fractured wells is investigated, and the PLT is employed to give more information about the properties of the different layers and to select an optimum fracturing design. The results show that one-stage and three-stage fractures behave differently: in general, a one-stage fracture should be created in high-quality areas of the reservoir for better performance, and, conversely, in three-stage fracturing, low-quality areas are better candidates.
Keywords: multi-stage fracturing, horizontal well, PLT, fracture length, number of stages
Procedia PDF Downloads 194
5672 Carbide Structure and Fracture Toughness of High Speed Tool Steels
Authors: Jung-Ho Moon, Tae Kwon Ha
Abstract:
M2 steel, the typical Co-free high speed steel (HSS) possessing a hardness level of 63-65 HRC, is most widely used for cutting tools. On the other hand, Co-containing HSSs, such as M35 and M42, show a higher hardness level of 65-67 HRC and are used for high-quality cutting tools. In the fabrication of HSSs, it is very important to control the cleanliness and the eutectic carbide structure of the ingot, and it is required to increase productivity at the same time. The production of HSS ingots includes a variety of processes, such as casting, electro-slag remelting (ESR), forging, blooming, and wire rod rolling. In the present study, the electro-slag rapid remelting (ESRR) process, an advanced ESR process combined with continuous casting, was successfully employed to fabricate HSS billets of M2, M35, and M42 steels. The distribution and structure of the eutectic carbides in the billets were analysed, and the cleanliness, hardness, and composition profiles of the billets were also evaluated.
Keywords: high speed tool steel, eutectic carbide, microstructure, hardness, fracture toughness
Procedia PDF Downloads 445
5671 A Rapid Prototyping Tool for Suspended Biofilm Growth Media
Authors: Erifyli Tsagkari, Stephanie Connelly, Zhaowei Liu, Andrew McBride, William Sloan
Abstract:
Biofilms play an essential role in treating water in biofiltration systems. Biofilm morphology and function are inextricably linked to the hydrodynamics of flow through a filter, and yet engineers rarely explicitly engineer this interaction. We develop a system that links computer simulation and 3-D printing to optimize and rapidly prototype filter media for biofilm function, with the hypothesis that biofilm function is intimately linked to the flow passing through the filter. A computational model that numerically solves the incompressible time-dependent Navier-Stokes equations, coupled to a model for biofilm growth and function, is developed. The model is embedded in an optimization algorithm that allows the model domain to adapt until criteria on biofilm functioning are met. This is applied to optimize the shape of filter media in a simple flow channel to promote biofilm formation. The computer code links directly to a 3-D printer, which allows us to prototype the design rapidly. Its validity is tested in flow-visualization experiments and by microscopy. As proof of concept, the code was constrained to explore a small range of potential filter media, in which the medium acts as an obstacle in the flow that sheds a von Karman vortex street, found to enhance the deposition of bacteria on downstream surfaces. The flow visualization and microscopy in the 3-D-printed realization of the flow channel validated the predictions of the model and hence its potential as a design tool. Overall, it is shown that the combination of our computational model and 3-D printing can be effectively used as a design tool to prototype filter media that optimize biofilm formation.
Keywords: biofilm, biofilter, computational model, von karman vortices, 3-D printing
Procedia PDF Downloads 142