Search results for: data mining techniques
28455 Optimizing Machine Learning Through Python Based Image Processing Techniques
Authors: Srinidhi. A, Naveed Ahmed, Twinkle Hareendran, Vriksha Prakash
Abstract:
This work reviews advanced image processing techniques for deep learning applications. Object detection by template matching, image denoising, edge detection, and super-resolution modelling are but a few of the tasks. The paper looks into these in great detail, given that such tasks are crucial preprocessing steps that increase the quality and usability of image datasets in subsequent deep learning tasks. We review methods for the assessment of image quality, more specifically sharpness, which is crucial to ensure robust model performance. Further, we discuss the development of deep learning models for facial emotion detection, age classification, and gender classification, examining how the preprocessing techniques interrelate with model performance. Conclusions from this study pinpoint best practices in the preparation of image datasets, targeting the best trade-off between computational efficiency and retention of the important image features critical for effective training of deep learning models.
Keywords: image processing, machine learning applications, template matching, emotion detection
Procedia PDF Downloads 152
28454 Modeling Residual Modulus of Elasticity of Self-Compacted Concrete Using Artificial Neural Networks
Authors: Ahmed M. Ashteyat
Abstract:
Artificial Neural Network (ANN) models have been widely used in material modeling, inter-correlations, and behavior and trend prediction when the nonlinear relationship between system parameters cannot be quantified explicitly and mathematically. In this paper, an ANN was used to predict the residual modulus of elasticity (RME) of self-compacted concrete (SCC) damaged by heat. The ANN model was built, trained, tested, and validated using a total of 112 experimental data sets gathered from the available literature. The data used in model development included temperature, relative humidity conditions, mix proportions, filler types, and fiber type. The results of ANN training, testing, and validation indicated that the RME of SCC exposed to different temperature and relative humidity levels could be predicted accurately with ANN techniques. The reliability between the predicted outputs and the actual experimental data was 99%. This shows that ANN has strong potential as a feasible tool for predicting the residual elastic modulus of SCC damaged by heat within the range of the input parameters. The ANN model could be used to estimate the RME of SCC as a rapid, inexpensive substitute for the much more complicated and time-consuming direct measurement of the RME of SCC.
Keywords: residual modulus of elasticity, artificial neural networks, self-compacted concrete, material modeling
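The ANN regression workflow described above can be illustrated with a minimal sketch. The network below is a toy one-hidden-layer model trained by plain gradient descent on a handful of made-up, normalized (temperature, humidity) → RME pairs; the paper's actual architecture, 112-sample dataset, and training setup are not specified here, so every value in this example is hypothetical.

```python
import math
import random

def train_tiny_ann(samples, hidden=4, epochs=500, lr=0.1, seed=42):
    """Train a one-hidden-layer network (tanh hidden units, linear output)
    on (input_vector, target) pairs with per-sample gradient descent."""
    rng = random.Random(seed)
    n_in = len(samples[0][0])
    w1 = [[rng.uniform(-0.5, 0.5) for _ in range(n_in)] for _ in range(hidden)]
    b1 = [0.0] * hidden
    w2 = [rng.uniform(-0.5, 0.5) for _ in range(hidden)]
    b2 = 0.0
    losses = []
    for _ in range(epochs):
        total = 0.0
        for x, y in samples:
            # Forward pass.
            h = [math.tanh(sum(wij * xj for wij, xj in zip(w1[i], x)) + b1[i])
                 for i in range(hidden)]
            pred = sum(w2[i] * h[i] for i in range(hidden)) + b2
            err = pred - y
            total += err * err
            # Backpropagate the squared-error gradient.
            for i in range(hidden):
                grad_h = err * w2[i] * (1 - h[i] ** 2)  # uses pre-update w2
                w2[i] -= lr * err * h[i]
                b1[i] -= lr * grad_h
                for j in range(n_in):
                    w1[i][j] -= lr * grad_h * x[j]
            b2 -= lr * err
        losses.append(total / len(samples))

    def predict(x):
        h = [math.tanh(sum(wij * xj for wij, xj in zip(w1[i], x)) + b1[i])
             for i in range(hidden)]
        return sum(w2[i] * h[i] for i in range(hidden)) + b2

    return predict, losses

# Hypothetical normalized inputs (temperature, humidity) -> normalized RME.
data = [([0.1, 0.8], 0.9), ([0.5, 0.5], 0.5), ([0.9, 0.2], 0.15),
        ([0.3, 0.7], 0.75), ([0.7, 0.4], 0.3)]
predict, losses = train_tiny_ann(data)
```

A real application would follow the paper's pattern of separate training, testing, and validation splits rather than fitting all points as above.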
Procedia PDF Downloads 534
28453 Recommendations Using Online Water Quality Sensors for Chlorinated Drinking Water Monitoring at Drinking Water Distribution Systems Exposed to Glyphosate
Authors: Angela Maria Fasnacht
Abstract:
Detection of anomalies due to contaminants’ presence, also known as early detection systems in water treatment plants, has become a critical point that deserves an in-depth study for their improvement and adaptation to current requirements. The design of these systems requires a detailed analysis and processing of the data in real-time, so it is necessary to apply various statistical methods appropriate to the data generated, such as Spearman’s Correlation, Factor Analysis, Cross-Correlation, and k-fold Cross-validation. Statistical analysis and methods allow the evaluation of large data sets to model the behavior of variables; in this sense, statistical treatment or analysis could be considered a vital step to be able to develop advanced models focused on machine learning that allows optimized data management in real-time, applied to early detection systems in water treatment processes. These techniques facilitate the development of new technologies used in advanced sensors. In this work, these methods were applied to identify the possible correlations between the measured parameters and the presence of the glyphosate contaminant in the single-pass system. The interaction between the initial concentration of glyphosate and the location of the sensors on the reading of the reported parameters was studied.
Keywords: glyphosate, emergent contaminants, machine learning, probes, sensors, predictive
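Of the statistical methods listed, Spearman's correlation is simple to sketch: rank both variables, then take the Pearson correlation of the rank vectors. The sensor readings below are invented for illustration and are not the study's data.

```python
def _ranks(values):
    """Average ranks (1-based); tied values share the mean rank."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    ranks = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1  # mean of positions i..j, converted to 1-based
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    return ranks

def spearman(x, y):
    """Spearman's rho: Pearson correlation computed on the ranks."""
    rx, ry = _ranks(x), _ranks(y)
    n = len(rx)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)

# Hypothetical readings: glyphosate dose vs. one reported sensor parameter.
dose = [0.0, 0.1, 0.5, 1.0, 2.0, 5.0]
conductivity = [410, 415, 430, 460, 520, 700]
rho = spearman(dose, conductivity)  # strictly monotonic -> rho = 1.0
```

Because Spearman works on ranks, it captures any monotonic relationship between dose and a sensor parameter, not just a linear one.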
Procedia PDF Downloads 121
28452 EEG Signal Processing Methods to Differentiate Mental States
Authors: Sun H. Hwang, Young E. Lee, Yunhan Ga, Gilwon Yoon
Abstract:
EEG is a very complex signal with noises and other bio-potential interferences. EOG is the most distinct interfering signal when EEG signals are measured and analyzed. How raw EEG signals are processed is therefore very important for obtaining useful information. In this study, EEG signal processing techniques such as EOG filtering and outlier removal were examined to minimize unwanted EOG signals and other noises. The two different mental states of resting and focusing were examined through EEG analysis. A focused state was induced by having subjects watch a red dot on a white screen. EEG data for 32 healthy subjects were measured. EEG data after 60-Hz notch filtering were processed by a commercially available EOG filter and by our proposed algorithm based on the removal of outliers. The ratio of beta wave to theta wave was used as a parameter for determining the degree of focusing. The results show that our algorithm was more appropriate than the existing EOG filtering.
Keywords: EEG, focus, mental state, outlier, signal processing
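The focus metric described, the ratio of beta-band power to theta-band power, can be sketched as follows. This toy example computes band power with a plain DFT on a synthetic two-tone signal; the study's EOG filtering and outlier-removal steps are not reproduced, and the signal is fabricated.

```python
import math

def band_power(signal, fs, f_lo, f_hi):
    """Power in [f_lo, f_hi] Hz via a plain DFT (fine for short windows)."""
    n = len(signal)
    power = 0.0
    for k in range(1, n // 2):
        freq = k * fs / n
        if f_lo <= freq <= f_hi:
            re = sum(signal[t] * math.cos(2 * math.pi * k * t / n)
                     for t in range(n))
            im = -sum(signal[t] * math.sin(2 * math.pi * k * t / n)
                      for t in range(n))
            power += (re * re + im * im) / n
    return power

fs, n = 128, 256  # 2 seconds sampled at 128 Hz
# Hypothetical "focused" trace: strong 20 Hz (beta) plus weak 6 Hz (theta).
focused = [1.0 * math.sin(2 * math.pi * 20 * t / fs) +
           0.2 * math.sin(2 * math.pi * 6 * t / fs) for t in range(n)]
ratio = band_power(focused, fs, 13, 30) / band_power(focused, fs, 4, 8)
```

A large beta/theta ratio would be read as a more focused state under this metric; real pipelines would use an FFT and Welch averaging rather than the O(n²) DFT shown here.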
Procedia PDF Downloads 283
28451 Historical Studies on Gilt Decorations on Glazed Surfaces
Authors: Sabra Saeidi
Abstract:
This research focuses on the historical techniques associated with the Lajevardina and Haft-Rangi methods of tile production, with emphasis on identifying the techniques used to insert gold sheets on the surface of such historical glazed tiles. In this regard, the history of the production of enamelled, gold-plated, and Lajevardina glazed pottery made during the Khwarazmshahid and Mongol eras (eleventh to thirteenth century) is first assessed to reach a better understanding of the background and history associated with historical glazing methods. After this historical overview of glazed pottery production techniques and an introduction to the civilizations using them, we focus on the production methods of enamel and Lajevardina glazing, which are two categories of decoration usually found in tiles. Next, a general classification method for various types of gilt tiles is introduced, which is applicable to tile works up to the Safavid period (sixteenth to seventeenth century). Gilded Lajevardina glazed tiles, gilt Haft-Rangi tiles, monolithic glazed gilt tiles, and gilt mosaic tiles are included in the categories.
Keywords: gilt tiles, Islamic art, Iranian art, historical studies, gilding
Procedia PDF Downloads 123
28450 Hyperspectral Imaging and Nonlinear Fukunaga-Koontz Transform Based Food Inspection
Authors: Hamidullah Binol, Abdullah Bal
Abstract:
Nowadays, food safety is a great public concern; therefore, robust and effective techniques are required for detecting the safety of goods. Hyperspectral Imaging (HSI) is an attractive technique for researchers inspecting food quality and safety, with applications such as meat quality assessment, automated poultry carcass inspection, quality evaluation of fish, bruise detection in apples, quality analysis and grading of citrus fruits, bruise detection in strawberries, visualization of sugar distribution in melons, measuring the ripening of tomatoes, defect detection in pickling cucumbers, and classification of wheat kernels. HSI can be used to concurrently collect large amounts of spatial and spectral data on the objects being observed. The technique yields exceptional detection capability that otherwise cannot be achieved with either imaging or spectroscopy alone. This paper presents a nonlinear technique based on the kernel Fukunaga-Koontz transform (KFKT) for detection of fat content in ground meat using HSI. The KFKT, the nonlinear version of the FKT, is one of the most effective techniques for solving problems of a two-pattern nature. The conventional FKT method has been improved with kernel machines to increase the nonlinear discrimination ability and capture higher-order statistics of the data. The approach proposed in this paper aims to segment the fat content of ground meat by regarding fat as the target class, which is to be separated from the remaining classes (treated as clutter). We have applied the KFKT to visible and near-infrared (VNIR) hyperspectral images of ground meat to determine fat percentage. The experimental studies indicate that the proposed technique produces high detection performance for fat ratio in ground meat.
Keywords: food (ground meat) inspection, Fukunaga-Koontz transform, hyperspectral imaging, kernel methods
Procedia PDF Downloads 431
28449 Teaching Kindness as Moral Virtue in Preschool Children: The Effectiveness of Picture-Storybook Reading and Hand-Puppet Storytelling
Authors: Rose Mini Agoes Salim, Shahnaz Safitri
Abstract:
The aim of this study is to test the effectiveness of teaching kindness to preschool children using several techniques. Kindness is a physical act or emotional support aimed at building or maintaining relationships with others. Kindness is known to be essential in the development of the moral reasoning needed to distinguish between good and bad. In this study, kindness is operationalized as several acts, including helping friends, comforting sad friends, inviting friends to play, protecting others, sharing, saying hello, saying thank you, encouraging others, and apologizing. Kindness is crucial to develop in preschool children because this is when children begin to interact with their social environment through play. Furthermore, preschool children's cognitive development lets them begin to represent the world with words, which then allows them to interact with others. On the other hand, preschool children's egocentric thinking means they still need to learn to consider another person's perspective. In relation to social interaction, preschool children need to be stimulated and assisted by adults to be able to pay attention to others and act with kindness toward them. In teaching kindness to children, the quality of interaction between children and their significant others is the key factor. Preschool children learn about kindness by imitating adults in their two-way interactions. Specifically, this study examines two teaching techniques that parents can use to teach kindness, namely picture-storybook reading and hand-puppet storytelling. These techniques were examined because both activities are easy to do and both provide a model of behavior for the child through the character in the story. To examine the effectiveness of these techniques in teaching kindness, two studies were conducted.
Study I involved 31 children aged 5-6 years old with the picture-storybook reading technique, where the intervention consisted of reading 8 picture books over 8 days. In Study II, the hand-puppet storytelling technique was examined with 32 children aged 3-5 years old. The treatments' effectiveness was measured using an instrument in the form of nine colored cards that describe kindness behaviors. Data analysis using the Wilcoxon signed-rank test shows a significant difference in the average kindness score (p < 0.05) before and after the intervention. For daily observation, a ‘kindness tree’ and observation sheets were used, filled out by the teacher. Two weeks after the interventions, the improvement in all measured kindness behaviors remained intact. The same result was also obtained from both the ‘kindness tree’ and the observation sheets.
Keywords: kindness, moral teaching, storytelling, hand puppet
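A minimal sketch of the Wilcoxon signed-rank statistic used in the analysis: pair the before/after scores, drop zero differences, rank the absolute differences (ties share average ranks), and take the smaller of the positive- and negative-rank sums. The scores below are invented, and a full analysis would also convert W to a p-value.

```python
def wilcoxon_w(before, after):
    """Wilcoxon signed-rank statistic W = min(W+, W-)."""
    diffs = [a - b for a, b in zip(after, before) if a != b]  # drop zeros
    order = sorted(range(len(diffs)), key=lambda i: abs(diffs[i]))
    ranks = [0.0] * len(diffs)
    i = 0
    while i < len(order):
        j = i
        while (j + 1 < len(order)
               and abs(diffs[order[j + 1]]) == abs(diffs[order[i]])):
            j += 1
        avg = (i + j) / 2 + 1  # average rank for the tied group
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    w_plus = sum(r for r, d in zip(ranks, diffs) if d > 0)
    w_minus = sum(r for r, d in zip(ranks, diffs) if d < 0)
    return min(w_plus, w_minus)

# Hypothetical kindness scores for 8 children, before and after intervention.
before = [3, 4, 2, 5, 3, 4, 2, 3]
after = [5, 6, 4, 5, 6, 5, 4, 4]
w = wilcoxon_w(before, after)  # every non-zero difference is positive, so W = 0
```

A small W (here 0, since all children improved or stayed level) is what drives a significant result in the paired test.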
Procedia PDF Downloads 252
28448 Negative Sequence-Based Protection Techniques for Microgrid Connected Power Systems
Authors: Isabelle Snyder, Travis Smith
Abstract:
Microgrid protection presents challenges to conventional protection techniques due to the low induced fault current. Protection relays in microgrid applications require a combination of settings groups to adjust based on the architecture of the microgrid in islanded and grid-connected modes. In a radial system where the microgrid is at the other end of the feeder, directional elements can be used to identify the direction of the fault current and switch settings groups accordingly (grid-connected or microgrid-connected). However, with multiple microgrid connections, this concept becomes more challenging, and the direction of the current alone is not sufficient to identify the source of the fault current contribution. ORNL has previously developed adaptive relaying schemes through other DOE-funded research projects that will be evaluated and used as a baseline for this research. The four protection techniques in this study are labeled as follows: (1) Adaptive Current-only Protection System (ACPS), (2) Intentional Unbalanced Control for Protection Control (IUCPC), (3) Adaptive Protection System with Communication Controller (APSCC), and (4) Adaptive Model-Driven Protective Relay (AMDPR).
Keywords: adaptive relaying, microgrid protection, sequence components, islanding detection
Procedia PDF Downloads 97
28447 Machine Learning Algorithms for Rocket Propulsion
Authors: Rômulo Eustáquio Martins de Souza, Paulo Alexandre Rodrigues de Vasconcelos Figueiredo
Abstract:
In recent years, there has been a surge of interest in applying artificial intelligence techniques, particularly machine learning algorithms. Machine learning is a data-analysis technique that automates the creation of analytical models, making it especially useful for modeling complex situations. As a result, this technology aids in reducing human intervention while producing accurate results. The methodology is also extensively used in aerospace engineering, a field that encompasses several high-complexity operations, such as rocket propulsion. Rocket propulsion is a high-risk operation in which engine failure could result in the loss of life. As a result, it is critical to use computational methods capable of precisely representing the spacecraft's analytical model to guarantee its security and operation. This paper therefore describes the use of machine learning algorithms for rocket propulsion, showing that this technique is an efficient way to deal with challenging and restrictive aerospace engineering activities. The paper focuses on three machine-learning-aided rocket propulsion applications: set-point control of an expander-bleed rocket engine, supersonic retro-propulsion of a small-scale rocket, and leak detection and isolation on rocket engine data. The paper describes the data-driven methods used for each implementation in depth and presents the obtained results.
Keywords: data analysis, modeling, machine learning, aerospace, rocket propulsion
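As a loose illustration of the third application, leak detection on engine telemetry can, in its simplest form, be framed as anomaly detection: flag readings that deviate sharply from recent trailing statistics. The rolling z-score detector and the simulated pressure trace below are illustrative assumptions only and are not the data-driven methods used in the paper.

```python
import math

def detect_anomalies(series, window=20, threshold=4.0):
    """Flag indices whose reading deviates from the trailing-window mean
    by more than `threshold` standard deviations."""
    flags = []
    for i in range(window, len(series)):
        hist = series[i - window:i]
        mean = sum(hist) / window
        var = sum((x - mean) ** 2 for x in hist) / window
        std = var ** 0.5 or 1e-12  # guard against a zero-variance window
        if abs(series[i] - mean) / std > threshold:
            flags.append(i)
    return flags

# Hypothetical chamber-pressure trace with a sudden drop (simulated leak).
pressure = [100 + 0.5 * math.sin(i / 5) for i in range(100)]
for i in range(60, 100):
    pressure[i] -= 15  # leak onset at sample 60
leaks = detect_anomalies(pressure)  # first flagged index is 60
```

Real leak detection and isolation would use multivariate models across many sensors; this single-channel threshold is only the core idea.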
Procedia PDF Downloads 115
28446 Resource Sharing Issues of Distributed Systems Influences on Healthcare Sector Concurrent Environment
Authors: Soo Hong Da, Ng Zheng Yao, Burra Venkata Durga Kumar
Abstract:
The healthcare sector is a business that consists of providing medical services, manufacturing medical equipment and drugs, and providing medical insurance to the public. Most of the data stored in a healthcare database relate to patients' information, which must be accurate when accessed by authorized stakeholders. In distributed systems, one important issue is concurrency, which ensures that shared resources are synchronized and remain consistent through multiple read and write operations by multiple clients. The concurrency problems in the healthcare sector are who gets access, and how the shared data are synchronized and kept consistent when two or more stakeholders attempt to access the shared data simultaneously. In this paper, a framework that benefits the distributed healthcare sector's concurrent environment is proposed. In the proposed framework, four levels of database nodes are explained: national center, regional center, referral center, and local center. Moreover, synchronization in the framework is not symmetrical: two synchronization techniques, complete and partial synchronization, are explained. Furthermore, for multiple clients accessing data at the same time, synchronization types are also discussed with cases at different levels and priorities to ensure data remain synchronized throughout the processes.
Keywords: resources, healthcare, concurrency, synchronization, stakeholders, database
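At its smallest scale, the concurrency problem described, two or more stakeholders updating shared patient data simultaneously, is handled with mutual exclusion around the read-modify-write step. The sketch below is a generic single-process illustration (the class and field names are invented), not the proposed framework's multi-level synchronization protocol.

```python
import threading

class PatientRecord:
    """Shared record guarded by a lock so concurrent writers cannot
    interleave their read-modify-write updates."""
    def __init__(self):
        self._lock = threading.Lock()
        self.visit_count = 0

    def add_visit(self):
        with self._lock:                     # enter the critical section
            current = self.visit_count      # read
            self.visit_count = current + 1  # modify and write atomically

record = PatientRecord()
# Four "stakeholders" each log 1000 visits concurrently.
threads = [threading.Thread(
               target=lambda: [record.add_visit() for _ in range(1000)])
           for _ in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()
# With the lock, the final count is exactly 4000; without it, interleaved
# read-modify-write sequences could silently lose updates.
```

In a distributed setting the same idea appears as distributed locks or transactional concurrency control rather than an in-process mutex.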
Procedia PDF Downloads 150
28445 Traditional Sustainable Architecture Techniques and Its Applications in Contemporary Architecture: Case Studies of the Islamic House in Fatimid Cairo and Sana'a, Cities in Egypt and Yemen
Authors: Ahmed S. Attia
Abstract:
This paper studies modern sustainable architectural techniques and elements that are originally found in vernacular and traditional architecture, particularly in the Arab region. Courtyards, wind catchers, and mashrabiya, for example, are elements that have been developed in contemporary architecture using modern technology to create sustainable architectural designs. An analytical study of the topic deals with examples of the Islamic house in the Fatimid city of Cairo in Egypt, analyzing its elements and their relationship to the environment, in addition to examples of sustainable architectural systems in southern Egypt (Nubia) and traditional houses in Sana'a city, Yemen, built using earth resources such as mud bricks and other construction materials. In conclusion, a comparative study between traditional and contemporary techniques confirms that it is possible to achieve sustainable architecture through the use of low technology in buildings in Arab regions.
Keywords: Islamic context, cultural environment, natural environment, Islamic house, low-technology, mud brick, vernacular and traditional architecture
Procedia PDF Downloads 299
28444 Deep Learning Based Polarimetric SAR Images Restoration
Authors: Hossein Aghababaei, Sergio Vitale, Giampaolo Ferraioli
Abstract:
In the context of Synthetic Aperture Radar (SAR) data, polarization is an important source of information for Earth's surface monitoring. SAR systems are often designed to transmit only one polarization. This constraint leads to either single or dual polarimetric SAR imaging modalities. Single polarimetric systems operate with a fixed single polarization of both the transmitted and received electromagnetic (EM) waves, resulting in a single acquisition channel. Dual polarimetric systems, on the other hand, transmit in one fixed polarization and receive in two orthogonal polarizations, resulting in two acquisition channels. Dual polarimetric systems are obviously more informative than single polarimetric systems and are increasingly being used for a variety of remote sensing applications. In dual polarimetric systems, the choice of polarizations for the transmitter and the receiver is open. The choice of circular transmit polarization and coherent dual linear receive polarizations forms a special dual polarimetric system called hybrid polarimetry, which brings rotational invariance to the geometrical orientations of features in the scene and optimizes the design of the radar in terms of reliability, mass, and power constraints. The complete characterization of target scattering, however, requires fully polarimetric data, which can be acquired with systems that transmit two orthogonal polarizations. This adds further complexity to data acquisition and shortens the coverage area, or swath, of fully polarimetric images compared to that of dual or hybrid polarimetric images. The search for solutions to augment dual polarimetric data to full polarimetric data therefore takes advantage of full characterization and exploitation of the backscattered field over a wider coverage with less system complexity. Several methods for reconstructing fully polarimetric images using hybrid polarimetric data can be found in the literature.
Although the improvements achieved by newly investigated and tested reconstruction techniques are undeniable, the existing methods are mostly based upon model assumptions (especially the assumption of reflectance symmetry), which may limit their reliability and applicability in vegetation and forest scenarios. To overcome the problems of these techniques, this paper proposes a new framework for reconstructing fully polarimetric information from hybrid polarimetric data. The framework uses deep learning to augment hybrid polarimetric data without relying on model assumptions. A convolutional neural network (CNN) with a specific architecture and loss function is defined for this augmentation problem by focusing on different scattering properties of the polarimetric data. In particular, the method controls the CNN training process with respect to several characteristic features of polarimetric images, defined by combining different terms in the cost or loss function. The proposed method is experimentally validated with real data sets and compared with a well-known, standard approach from the literature. In the experiments, the reconstruction performance of the proposed framework is superior to conventional reconstruction methods. The pseudo fully polarimetric data reconstructed by the proposed method also agree well with the actual fully polarimetric images acquired by radar systems, confirming the reliability and efficiency of the proposed method.
Keywords: SAR image, deep learning, convolutional neural network, deep neural network, SAR polarimetry
Procedia PDF Downloads 90
28443 Pattern Discovery from Student Feedback: Identifying Factors to Improve Student Emotions in Learning
Authors: Angelina A. Tzacheva, Jaishree Ranganathan
Abstract:
Interest in Science, Technology, Engineering, and Mathematics (STEM) education, especially Computer Science education, has seen a drastic increase across the country. This fuels efforts toward recruiting and admitting a diverse population of students. The changing conditions in terms of student population, diversity, and expected teaching and learning outcomes thus provide a platform for the use of innovative teaching models and technologies. It is necessary that the methods adopted also concentrate on raising the quality of such innovations and have a positive impact on student learning. Light-Weight Team is an active learning pedagogy that is considered a low-stakes activity and has very little or no direct impact on student grades. Emotion plays a major role in students' motivation to learn. In this work, we use student feedback data with emotion classification, collected through surveys at a public research institution in the United States. We use an Actionable Pattern Discovery method for this purpose. Actionable patterns are patterns that provide suggestions, in the form of rules, to help the user achieve better outcomes. The proposed method provides meaningful insight in terms of changes that can be incorporated in the Light-Weight Team activities and the resources utilized in the course. The results suggest how to shift student emotions to a more positive state, focusing in particular on the emotions ‘Trust’ and ‘Joy’.
Keywords: actionable pattern discovery, education, emotion, data mining
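The rule-form patterns mentioned are typically scored by support and confidence. The sketch below evaluates one hypothetical rule ("team activity with timely hints → joy") over invented coded feedback records; the paper's actionable pattern discovery method is more involved than this, and none of the feature names are from the study.

```python
def support(transactions, itemset):
    """Fraction of transactions containing every item in `itemset`."""
    itemset = set(itemset)
    hits = sum(1 for t in transactions if itemset <= set(t))
    return hits / len(transactions)

def confidence(transactions, lhs, rhs):
    """Confidence of the rule lhs -> rhs: support(lhs and rhs) / support(lhs)."""
    return support(transactions, set(lhs) | set(rhs)) / support(transactions, lhs)

# Hypothetical coded feedback records: activity features plus resulting emotion.
feedback = [
    {"team_activity", "timely_hints", "joy"},
    {"team_activity", "timely_hints", "joy"},
    {"team_activity", "joy"},
    {"solo_activity", "sadness"},
    {"team_activity", "timely_hints", "trust"},
    {"solo_activity", "timely_hints", "joy"},
]
conf = confidence(feedback, {"team_activity", "timely_hints"}, {"joy"})  # 2/3
```

An actionable rule is one whose antecedent contains attributes the instructor can actually change (activity type, hints), so a high-confidence rule directly suggests an intervention.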
Procedia PDF Downloads 98
28442 Information Visualization Methods Applied to Nanostructured Biosensors
Authors: Osvaldo N. Oliveira Jr.
Abstract:
The control of molecular architecture inherent in some experimental methods of producing nanostructured films has had great impact on devices of various types, including sensors and biosensors. The self-assembled monolayer (SAM) and electrostatic layer-by-layer (LbL) techniques, for example, are now routinely used to produce tailored architectures for biosensing in which biomolecules are immobilized with long-lasting preserved activity. Enzymes, antigens, antibodies, peptides, and many other molecules serve as the molecular recognition elements for detecting an equally wide variety of analytes. The principles of detection are also varied, including electrochemical methods, fluorescence spectroscopy, and impedance spectroscopy. This presentation provides an overview of biosensors made with nanostructured films to detect antibodies associated with tropical diseases and HIV, in addition to the detection of analytes of medical interest such as cholesterol and triglycerides. Because large amounts of data are generated in biosensing experiments, computational and statistical methods have been used to optimize performance. Multidimensional projection techniques such as Sammon's mapping have been shown to be more efficient than traditional multivariate statistical analysis in identifying small concentrations of anti-HIV antibodies and in distinguishing between blood serum samples of animals infected with two tropical diseases, namely Chagas' disease and Leishmaniasis. Optimization of biosensing may include combining another information visualization method, the parallel coordinates technique, with artificial intelligence methods in order to identify the most suitable frequencies for reaching higher sensitivity using impedance spectroscopy.
Also discussed will be the possible convergence of technologies, through which machine learning and other computational methods may be used to treat data from biosensors within an expert system for clinical diagnosis.
Keywords: clinical diagnosis, information visualization, nanostructured films, layer-by-layer technique
Procedia PDF Downloads 337
28441 An Empirical Study to Predict Myocardial Infarction Using K-Means and Hierarchical Clustering
Authors: Md. Minhazul Islam, Shah Ashisul Abed Nipun, Majharul Islam, Md. Abdur Rakib Rahat, Jonayet Miah, Salsavil Kayyum, Anwar Shadaab, Faiz Al Faisal
Abstract:
The target of this research is to predict myocardial infarction using unsupervised machine learning algorithms. Predicting myocardial infarction, a heart disease, is a challenge faced by doctors and hospitals, and the accuracy of the prediction plays a vital role. With this concern, the authors analyzed a myocardial dataset to predict myocardial infarction using two popular machine learning algorithms, K-Means and Hierarchical Clustering. This research includes the collection of data and the classification of data using machine learning algorithms. The authors collected 345 instances with 26 attributes from different hospitals in Bangladesh. The data were collected from patients suffering from myocardial infarction along with other symptoms. The model would be able to find and mine hidden facts from historical myocardial infarction cases. The aim of this study is to analyze the accuracy achievable in predicting myocardial infarction using machine learning techniques.
Keywords: machine learning, K-means, hierarchical clustering, myocardial infarction, heart disease
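One of the two algorithms used, K-Means, can be sketched in a few lines of plain Python as Lloyd's algorithm: assign each point to its nearest centroid, then recompute each centroid as the mean of its cluster. The two-feature "patient" points below are fabricated for illustration and are not the 345-instance dataset.

```python
import math
import random

def kmeans(points, k, iters=50, seed=0):
    """Plain Lloyd's algorithm on tuples of equal dimension."""
    rng = random.Random(seed)
    centroids = rng.sample(points, k)  # initialize from the data points
    clusters = [[] for _ in range(k)]
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            idx = min(range(k), key=lambda c: math.dist(p, centroids[c]))
            clusters[idx].append(p)
        # Recompute centroids; keep the old one if a cluster emptied out.
        centroids = [
            tuple(sum(dim) / len(cl) for dim in zip(*cl)) if cl else centroids[i]
            for i, cl in enumerate(clusters)]
    return centroids, clusters

# Hypothetical 2-feature patient measurements forming two obvious groups.
low_risk = [(1.0, 1.2), (0.8, 1.0), (1.1, 0.9), (0.9, 1.1)]
high_risk = [(8.0, 8.2), (7.9, 8.1), (8.2, 7.8), (8.1, 8.0)]
centroids, clusters = kmeans(low_risk + high_risk, k=2)
```

With well-separated groups like these, the algorithm recovers the two clusters regardless of which points seed the centroids; real clinical data with 26 attributes would first need scaling and careful distance choice.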
Procedia PDF Downloads 203
28440 Implications of Circular Economy on Users Data Privacy: A Case Study on Android Smartphones Second-Hand Market
Authors: Mariia Khramova, Sergio Martinez, Duc Nguyen
Abstract:
Modern electronic devices, particularly smartphones, are characterised by an extremely high environmental footprint and a short product lifecycle. Every year manufacturers release new models with even more superior performance, which pushes customers towards new purchases. As a result, millions of devices are accumulating in the urban mine. To tackle these challenges, the concept of the circular economy has been introduced to promote the repair, reuse, and recycling of electronics. In this case, electronic devices that previously ended up in landfills or households get a second life, reducing the demand for new raw materials. Smartphone reuse is gradually gaining wider adoption, partly due to the price increase of flagship models, consequently boosting circular economy implementation. However, along with the reuse of a communication device, the circular economy approach needs to ensure that the data of the previous user have not been 'reused' together with the device. This is especially important since modern smartphones are comparable with computers in terms of performance and the amount of data stored. These data vary from pictures, videos, and call logs to social security numbers, passport and credit card details, and from personal information to corporate confidential data. To assess how well data privacy requirements are followed on the second-hand smartphone market, a sample of 100 Android smartphones was purchased from IT Asset Disposition (ITAD) facilities responsible for data erasure and resale. Although the devices should not have stored any user data by the time they left the ITAD, it was possible to retrieve data from 19% of the sample. The applied techniques varied from manual device inspection to sophisticated equipment and tools. These findings indicate a significant barrier to the implementation of the circular economy and a limitation of smartphone reuse.
Therefore, in order to motivate users to donate or sell their old devices and make electronics use more sustainable, data privacy on the second-hand smartphone market should be significantly improved. The presented research has been carried out in the framework of the sustainablySMART project, which is part of the Horizon 2020 EU Framework Programme for Research and Innovation.
Keywords: android, circular economy, data privacy, second-hand phones
Procedia PDF Downloads 128
28439 Performance Analysis of Proprietary and Non-Proprietary Tools for Regression Testing Using Genetic Algorithm
Authors: K. Hema Shankari, R. Thirumalaiselvi, N. V. Balasubramanian
Abstract:
The present paper addresses research in the area of regression testing, with emphasis on automated tools as well as prioritization of test cases. The uniqueness of regression testing and its cyclic nature are pointed out. The difference in approach between industry, with a business model as its basis, and academia, with a focus on data mining, is highlighted. Test metrics are discussed as a prelude to our formula for prioritization; a case study is discussed to illustrate this methodology. An industrial case study is also described in the paper, where the number of test cases is so large that they have to be grouped as test suites. In such situations, a genetic algorithm proposed by us can be used to reconfigure these test suites in each cycle of regression testing. The comparison is made between a proprietary tool and an open-source tool using the above-mentioned metrics. Our approach is clarified through several tables.
Keywords: APFD metric, genetic algorithm, regression testing, RFT tool, test case prioritization, selenium tool
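A minimal sketch of the two ingredients named in the keywords, the APFD metric and a genetic algorithm over test orderings, is given below. APFD = 1 - (TF₁ + … + TFₘ)/(n·m) + 1/(2n), where TFᵢ is the 1-based position of the first test revealing fault i. The fault matrix, population size, and operators are illustrative assumptions; the paper's actual GA for reconfiguring test suites is not reproduced.

```python
import random

def apfd(order, fault_matrix):
    """Average Percentage of Faults Detected for a test-case ordering.
    fault_matrix[t] is the set of faults test case t reveals."""
    n = len(order)
    faults = set().union(*fault_matrix)
    m = len(faults)
    first_pos = {}
    for pos, t in enumerate(order, start=1):
        for f in fault_matrix[t]:
            first_pos.setdefault(f, pos)  # keep the earliest position
    return 1 - sum(first_pos[f] for f in faults) / (n * m) + 1 / (2 * n)

def ga_prioritize(fault_matrix, pop=30, gens=40, seed=1):
    """Toy GA over permutations: tournament selection of size two plus a
    single swap mutation, maximizing APFD."""
    rng = random.Random(seed)
    n = len(fault_matrix)
    population = [rng.sample(range(n), n) for _ in range(pop)]
    for _ in range(gens):
        nxt = []
        for _ in range(pop):
            a, b = rng.sample(population, 2)
            parent = a if apfd(a, fault_matrix) >= apfd(b, fault_matrix) else b
            child = parent[:]
            i, j = rng.randrange(n), rng.randrange(n)  # swap mutation
            child[i], child[j] = child[j], child[i]
            nxt.append(child)
        population = nxt
    return max(population, key=lambda o: apfd(o, fault_matrix))

# Hypothetical suite: which faults each of 5 test cases detects.
faults_of = [{1}, {1, 2, 3}, set(), {4}, {2, 4}]
best = ga_prioritize(faults_of)
```

For this toy matrix the best achievable APFD is 0.85 (e.g. run test 1 first, then test 4); in practice crossover operators and real fault or coverage data replace the pieces assumed here.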
Procedia PDF Downloads 436
28438 Monitoring the Pollution Status of the Goan Coast Using Genotoxicity Biomarkers in the Bivalve, Meretrix ovum
Authors: Avelyno D'Costa, S. K. Shyama, M. K. Praveen Kumar
Abstract:
The coast of Goa, India, receives constant anthropogenic stress through its major rivers, which carry mining rejects of iron and manganese ores from upstream mining sites and petroleum hydrocarbons from shipping and harbor-related activities, putting aquatic fauna such as bivalves at risk. The present study reports the pollution status of the Goan coast with respect to the above xenobiotics, employing genotoxicity studies. This is further supplemented by the quantification of total petroleum hydrocarbons (TPHs) and various trace metals (iron, manganese, copper, cadmium, and lead) in the gills of the estuarine clam, Meretrix ovum, as well as in the surrounding water and sediment, over a two-year sampling period from January 2013 to December 2014. Bivalves were collected from a probably unpolluted site at Palolem and a probably polluted site at Vasco, based on the anthropogenic activities at these sites. Genotoxicity was assessed in the gill cells using the comet assay and micronucleus test. The quantities of TPHs and trace metals present in gill tissue, water, and sediments were analyzed using spectrofluorometry and atomic absorption spectrophotometry (AAS), respectively. The statistical significance of the data was analyzed employing Student’s t-test. The relationship between DNA damage and pollutant concentrations was evaluated using multiple regression analysis. Significant DNA damage was observed in the bivalves collected from Vasco, which is a region of high industrial activity. Concentrations of TPHs and trace metals (iron, manganese, and cadmium) were also found to be significantly high in the gills of the bivalves collected from Vasco compared to those collected from Palolem. Further, the concentrations of these pollutants were also found to be significantly high in the water and sediments at Vasco compared to those at Palolem. This may be due to the lack of industrial activity at Palolem.
A high positive correlation was observed between the pollutant levels and DNA damage in the bivalves collected from Vasco, suggesting the genotoxic nature of these pollutants. Further, M. ovum can be used as a bioindicator species for monitoring the level of pollution of estuarine/coastal regions by TPHs and trace metals.
Keywords: comet assay, metals, micronucleus test, total petroleum hydrocarbons
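The multiple regression step described above can be sketched in Python. The pollutant values and comet-assay responses below are illustrative stand-ins, not the study's measurements:

```python
import numpy as np

def fit_multiple_regression(X, y):
    """Ordinary least squares y = b0 + b1*x1 + b2*x2 + ...; returns coefficients and R^2."""
    A = np.column_stack([np.ones(len(X)), X])       # prepend an intercept column
    coeffs, *_ = np.linalg.lstsq(A, y, rcond=None)  # [b0, b1, b2, ...]
    y_hat = A @ coeffs
    ss_res = float(np.sum((y - y_hat) ** 2))
    ss_tot = float(np.sum((y - y.mean()) ** 2))
    return coeffs, 1.0 - ss_res / ss_tot

# Hypothetical readings: columns are TPH and iron concentration in gill tissue;
# the response is % tail DNA from the comet assay (invented values, constructed
# here as exactly 1 + 2*TPH + 0.5*Fe so the fit is easy to check).
X = np.array([[1.0, 10.0], [2.0, 9.0], [3.0, 14.0], [4.0, 8.0], [5.0, 20.0]])
y = np.array([8.0, 9.5, 14.0, 13.0, 21.0])
coeffs, r2 = fit_multiple_regression(X, y)
```

A high R^2 with positive pollutant coefficients is the pattern the abstract reports; real data would of course not fit perfectly.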
Procedia PDF Downloads 237
28437 Image Segmentation with Deep Learning of Prostate Cancer Bone Metastases on Computed Tomography
Authors: Joseph M. Rich, Vinay A. Duddalwar, Assad A. Oberai
Abstract:
Prostate adenocarcinoma is the most common cancer in males, with bone being the most common site of metastatic prostate carcinoma (mPC). Treatment monitoring is based on the evaluation and characterization of lesions on multiple imaging studies, including Computed Tomography (CT). Monitoring of the osseous disease burden, including follow-up of lesions and identification and characterization of new lesions, is a laborious task for radiologists. Deep learning algorithms are increasingly used to perform tasks such as identification and segmentation of osseous metastatic disease and to provide accurate information regarding metastatic burden. Here, nnUNet was used to produce a model which can segment CT scan images of prostate adenocarcinoma vertebral bone metastatic lesions. nnUNet is an open-source Python package that adds optimizations to the deep learning-based UNet architecture but has not been extensively combined with transfer learning techniques because it does not provide this functionality out of the box. The IRB-approved study data set includes imaging studies from patients with mPC who were enrolled in clinical trials at the University of Southern California (USC) Health Science Campus and Los Angeles County (LAC)/USC medical center. Manual segmentation of metastatic lesions was completed by an expert radiologist, Dr. Vinay Duddalwar (20+ years of experience in radiology and oncologic imaging), to serve as the ground truth for the automated segmentation. Despite nnUNet’s success on some medical segmentation tasks, it only produced an average Dice Similarity Coefficient (DSC) of 0.31 on the USC dataset. DSC results fell in a bimodal distribution, with most scores falling either over 0.66 (reasonably accurate) or at 0 (no lesion detected). Applying more aggressive data augmentation techniques dropped the DSC to 0.15, and reducing the number of epochs reduced the DSC to below 0.1.
Datasets have been identified for transfer learning, a choice that involves balancing dataset size against similarity to the target domain. Identified datasets include the Pancreas data from the Medical Segmentation Decathlon, the Pelvic Reference Data, and CT volumes with multiple organ segmentations (CT-ORG). Some of the challenges of producing an accurate model from the USC dataset include the small dataset size (115 images), 2D data (nnUNet generally performs better on 3D data), and the limited amount of public data capturing annotated CT images of bone lesions. Optimizations and improvements will be made by applying transfer learning and generative methods, including incorporating generative adversarial networks and diffusion models, in order to augment the dataset. Performance with different libraries, including MONAI and custom architectures in PyTorch, will be compared. In the future, molecular correlations will be tracked against radiologic features for the purpose of multimodal composite biomarker identification. Once validated, these models will be incorporated into evaluation workflows to optimize radiologist evaluation. Our work demonstrates the challenges of applying automated image segmentation to small medical datasets and lays a foundation for techniques to improve performance. As machine learning models become increasingly incorporated into the workflow of radiologists, these findings will help improve the speed and accuracy of vertebral metastatic lesion detection.
Keywords: deep learning, image segmentation, medicine, nnUNet, prostate carcinoma, radiomics
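The Dice Similarity Coefficient used to score the segmentations has a compact definition; a minimal sketch on toy masks (not the USC data):

```python
import numpy as np

def dice_coefficient(pred, truth):
    """DSC = 2|A ∩ B| / (|A| + |B|) for binary segmentation masks."""
    pred, truth = pred.astype(bool), truth.astype(bool)
    total = int(pred.sum()) + int(truth.sum())
    if total == 0:
        return 1.0  # both masks empty: treat as perfect agreement
    return 2.0 * int(np.logical_and(pred, truth).sum()) / total

# Toy 3x3 masks: predicted vs. ground-truth lesion pixels.
pred  = np.array([[0, 1, 1], [0, 1, 0], [0, 0, 0]])
truth = np.array([[0, 1, 0], [0, 1, 1], [0, 0, 0]])
dsc = dice_coefficient(pred, truth)  # overlap of 2 pixels, mask sizes 3 + 3
```

A model that detects no lesion at all scores 0, which is exactly the lower mode of the bimodal distribution the abstract describes.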
Procedia PDF Downloads 96
28436 Optimizing Energy Efficiency: Leveraging Big Data Analytics and AWS Services for Buildings and Industries
Authors: Gaurav Kumar Sinha
Abstract:
In an era marked by increasing concerns about energy sustainability, this research endeavors to address the pressing challenge of energy consumption in buildings and industries. This study delves into the transformative potential of AWS services in optimizing energy efficiency. The research is founded on the recognition that effective management of energy consumption is imperative for both environmental conservation and economic viability. Buildings and industries account for a substantial portion of global energy use, making it crucial to develop advanced techniques for analysis and reduction. This study sets out to explore the integration of AWS services with big data analytics to provide innovative solutions for energy consumption analysis. Leveraging AWS's cloud computing capabilities, scalable infrastructure, and data analytics tools, the research aims to develop efficient methods for collecting, processing, and analyzing energy data from diverse sources. The core focus is on creating predictive models and real-time monitoring systems that enable proactive energy management. By harnessing AWS's machine learning and data analytics capabilities, the research seeks to identify patterns, anomalies, and optimization opportunities within energy consumption data. Furthermore, this study aims to propose actionable recommendations for reducing energy consumption in buildings and industries. By combining AWS services with metrics-driven insights, the research strives to facilitate the implementation of energy-efficient practices, ultimately leading to reduced carbon emissions and cost savings. The integration of AWS services not only enhances the analytical capabilities but also offers scalable solutions that can be customized for different building and industrial contexts. 
The research also recognizes the potential for AWS-powered solutions to promote sustainable practices and support environmental stewardship.
Keywords: energy consumption analysis, big data analytics, AWS services, energy efficiency
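As a minimal, cloud-agnostic sketch of the kind of real-time anomaly detection described (no actual AWS APIs are used here, and the readings are invented), a trailing-window z-score can flag anomalous consumption:

```python
from statistics import mean, stdev

def flag_anomalies(readings, window=24, z_threshold=3.0):
    """Flag readings whose z-score against the trailing window exceeds the threshold."""
    anomalies = []
    for i in range(window, len(readings)):
        history = readings[i - window:i]
        mu, sigma = mean(history), stdev(history)
        if sigma > 0 and abs(readings[i] - mu) / sigma > z_threshold:
            anomalies.append(i)
    return anomalies

# 48 hourly kWh readings oscillating near 100-102, with a spike at hour 30.
readings = [100.0 + (i % 3) for i in range(48)]
readings[30] = 250.0
spikes = flag_anomalies(readings)
```

In a managed pipeline the same logic would run over streamed meter data; the spike at hour 30 is the only reading flagged here.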
Procedia PDF Downloads 64
28435 Seismic Inversion for Geothermal Exploration
Authors: E. N. Masri, E. Takács
Abstract:
Amplitude Versus Offset (AVO) and simultaneous model-based impedance inversion techniques have not commonly been utilized for geothermal exploration; however, some recent publications have called attention to how useful they can be in geothermal investigations. In this study, we present rock-physical attributes obtained from 3D pre-stack seismic data and well logs collected in a study area in the NW part of the Pannonian Basin, where the geothermal reservoir is located in the fractured zones of the Triassic basement and was penetrated by three production-injection well pairs. The wells were planned very successfully based on the conventional 3D migrated stack volume prior to this study. Subsequently, the available geophysical-geological datasets provided a great opportunity to test modern inversion procedures in the same area. In this presentation, we provide a summary of the theory and application of the most promising seismic inversion techniques from the viewpoint of geothermal exploration. We demonstrate the P- and S-wave impedance, velocity (Vp and Vs), density, and Vp/Vs ratio attribute volumes calculated from the seismic and well-logging data sets. After a detailed discussion, we conclude that P-wave impedance and the Vp/Vs ratio are the most helpful parameters for lithology discrimination in the study area. They detect the hot-water-saturated fracture zone very well and can thus be very useful in mapping the investigated reservoir. Integrated interpretation of all the obtained rock-physical parameters is essential. We are extending the above-discussed pre-stack seismic tools by studying the possibilities of Elastic Impedance Inversion (EII) for geothermal exploration. That procedure provides two other useful rock-physical properties, compressibility and rigidity (the Lamé parameters). Results for those newly created elastic parameters will also be demonstrated in the presentation.
Geothermal extraction is of great interest nowadays, and several methods that have been successfully applied in hydrocarbon exploration for decades can be adopted to discover new reservoirs and reduce drilling risk and cost.
Keywords: fractured zone, seismic, well-logging, inversion
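The impedance and Vp/Vs attributes discussed above follow directly from the log curves. A small sketch with hypothetical basement values; the contrast direction shown (lower impedance and higher Vp/Vs in the water-saturated fracture zone) is the usual rock-physics expectation, not a number quoted from the study:

```python
def p_impedance(vp, rho):
    """Acoustic (P-wave) impedance Zp = rho * Vp."""
    return rho * vp

def vp_vs_ratio(vp, vs):
    return vp / vs

# Hypothetical basement log samples in SI units (m/s, kg/m^3).
fractured = {"vp": 5200.0, "vs": 2800.0, "rho": 2650.0}  # hot-water-saturated zone
tight     = {"vp": 5900.0, "vs": 3400.0, "rho": 2720.0}  # unfractured basement

zp_fractured = p_impedance(fractured["vp"], fractured["rho"])
zp_tight = p_impedance(tight["vp"], tight["rho"])
ratio_fractured = vp_vs_ratio(fractured["vp"], fractured["vs"])
ratio_tight = vp_vs_ratio(tight["vp"], tight["vs"])
```

Mapping these two attributes over the volume is what lets the fracture zone stand out from the surrounding tight basement.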
Procedia PDF Downloads 126
28434 Buy-and-Hold versus Alternative Strategies: A Comparison of Market-Timing Techniques
Authors: Jonathan J. Burson
Abstract:
With the rise of virtually costless, mobile-based trading platforms, stock market trading activity has increased significantly over the past decade, particularly for the millennial generation. This increased stock market attention, combined with the recent market turmoil caused by the economic upset of COVID-19, makes the topics of market-timing and forecasting particularly relevant. While the overall stock market saw an unprecedented, historically long bull market from March 2009 to February 2020, the end of that bull market reignited a search by investors for a way to reduce risk and increase return. Similar searches for outperformance occurred in the early and late 2000s as the dot-com bubble burst and the Great Recession led to years of negative returns for mean-variance index investors. Extensive research has been conducted on fundamental analysis, technical analysis, macroeconomic indicators, microeconomic indicators, and other techniques—all using different methodologies and investment periods—in pursuit of higher returns with lower risk. The enormous variety of timeframes, data, and methodologies used by the diverse forecasting methods makes it difficult to compare the outcome of each method directly to other methods. This paper establishes a process to evaluate market-timing methods in an apples-to-apples manner based on simplicity, performance, and feasibility. Preliminary findings show that certain technical analysis models provide a higher return with lower risk when compared to the buy-and-hold method and to other market-timing strategies. Furthermore, technical analysis models tend to be easier for individual investors both in terms of acquiring the data and in analyzing it, making technical analysis-based market-timing methods the preferred choice for retail investors.
Keywords: buy-and-hold, forecast, market-timing, probit, technical analysis
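One way to compare a technical-analysis rule to buy-and-hold on equal footing is to run both over the same price series. A toy sketch with an invented series and a simple moving-average crossover rule (illustrative only; it makes no claim about which strategy wins in general):

```python
def moving_average(prices, n):
    """Trailing n-period simple moving average; None until enough history exists."""
    return [sum(prices[i - n + 1:i + 1]) / n if i >= n - 1 else None
            for i in range(len(prices))]

def crossover_return(prices, fast=2, slow=4):
    """Hold the asset only while yesterday's fast MA was above the slow MA."""
    f, s = moving_average(prices, fast), moving_average(prices, slow)
    wealth = 1.0
    for i in range(1, len(prices)):
        in_market = (f[i - 1] is not None and s[i - 1] is not None
                     and f[i - 1] > s[i - 1])
        if in_market:
            wealth *= prices[i] / prices[i - 1]
    return wealth

def buy_and_hold_return(prices):
    return prices[-1] / prices[0]

prices = [10, 11, 12, 11, 10, 9, 10, 11, 12, 13]  # invented price path
```

Using the prior day's signal avoids look-ahead bias; on this particular toy path buy-and-hold happens to come out ahead.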
Procedia PDF Downloads 97
28433 Anticorrosive Properties of Poly(O-Phenylendiamine)/ZnO Nanocomposites Coated Stainless Steel
Authors: Aisha Ganash
Abstract:
Poly(o-phenylendiamine) and poly(o-phenylendiamine)/ZnO (PoPd/ZnO) nanocomposite coatings were prepared on type-304 austenitic stainless steel (SS) in H2SO4 acid electrolyte by potentiostatic methods. Fourier transform infrared spectroscopy and scanning electron microscopy were used to characterize the composition and structure of the PoPd/ZnO nanocomposites. The corrosion protection ability of the polymer coatings was studied by Eocp-time measurement, anodic and cathodic potentiodynamic polarization, and impedance techniques in 3.5% NaCl as a corrosive solution. It was found that ZnO nanoparticles improve the barrier and electrochemical anticorrosive properties of poly(o-phenylendiamine).
Keywords: anticorrosion, conducting polymers, electrochemistry, nanocomposites
Procedia PDF Downloads 292
28432 A Methodology for Automatic Diversification of Document Categories
Authors: Dasom Kim, Chen Liu, Myungsu Lim, Su-Hyeon Jeon, ByeoungKug Jeon, Kee-Young Kwahk, Namgyu Kim
Abstract:
Recently, numerous documents, including unstructured data and text, have been created due to the rapid increase in the usage of social media and the Internet. Each document is usually assigned a specific category for the convenience of users. In the past, categorization was performed manually; however, with manual categorization, not only is the accuracy of the categorization not guaranteed, but the process also requires a large amount of time and huge costs. Many studies have been conducted on the automatic creation of categories to overcome the limitations of manual categorization. Unfortunately, most of these methods cannot be applied to categorizing complex documents with multiple topics because they assume that one document can be assigned to only one category. To overcome this limitation, some studies have attempted to categorize each document into multiple categories. However, they are also limited in that their learning process involves training on a multi-categorized document set; these methods therefore cannot be applied to the multi-categorization of most documents unless multi-categorized training sets are provided. To overcome traditional multi-categorization algorithms' requirement of a multi-categorized training set, we previously proposed a new methodology that can extend the category of a single-categorized document to multiple categories by analyzing relationships among categories, topics, and documents. In this paper, we design a survey-based verification scenario for estimating the accuracy of our automatic categorization methodology.
Keywords: big data analysis, document classification, multi-category, text mining, topic analysis
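The idea of extending a single assigned category by analyzing term-level relationships can be sketched as follows. The category profiles, threshold, and documents are invented for illustration and do not reproduce the authors' actual methodology:

```python
from collections import Counter
from math import sqrt

def cosine(a, b):
    """Cosine similarity between two term-frequency Counters."""
    dot = sum(count * b[term] for term, count in a.items())
    norm_a = sqrt(sum(v * v for v in a.values()))
    norm_b = sqrt(sum(v * v for v in b.values()))
    return dot / (norm_a * norm_b) if norm_a and norm_b else 0.0

def extend_categories(doc_terms, category_profiles, primary, threshold=0.5):
    """Keep the manually assigned category; add any other category whose
    term profile is sufficiently close to the document's own terms."""
    doc = Counter(doc_terms)
    extra = [c for c, profile in category_profiles.items()
             if c != primary and cosine(doc, profile) >= threshold]
    return [primary] + sorted(extra)

# Invented category term profiles and a mixed-topic document.
profiles = {
    "sports":  Counter({"match": 4, "team": 3, "coach": 2}),
    "finance": Counter({"market": 4, "stock": 3, "team": 1}),
    "health":  Counter({"diet": 4, "exercise": 3}),
}
doc = ["team", "coach", "market", "stock", "market"]
categories = extend_categories(doc, profiles, primary="sports")
```

A document filed only under "sports" but dominated by market terms picks up "finance" as a second category.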
Procedia PDF Downloads 272
28431 Multi-Criteria Decision Approach to Performance Measurement Techniques Data Envelopment Analysis: Case Study of Kerman City’s Parks
Authors: Ali A. Abdollahi
Abstract:
During the last several decades, scientists have consistently applied Multiple Criteria Decision-Making methods to decisions about multi-faceted, complicated subjects. While making such decisions, and in order to achieve more accurate evaluations, they have regularly used a variety of criteria instead of a single optimum evaluation criterion. The method presented here utilizes both ‘quantity’ and ‘quality’ to assess the function of the multiple-criteria method. Applying Data Envelopment Analysis (DEA), Weighted Aggregated Sum Product Assessment (WASPAS), the Weighted Sum Approach (WSA), the Analytic Network Process (ANP), and the Charnes, Cooper, Rhodes (CCR) methods, we have analyzed thirteen parks in Kerman city. The analysis further indicates that the functions of WASPAS and WSA are compatible with each other, but that their deviation from DEA is extensive. Finally, the results for the CCR technique do not match the results of the DEA technique. Our study indicates that the ANP method, with an average rate of 1/51, ranks closest to the DEA method, which has an average rate of 1/49.
Keywords: multiple criteria decision making, Data envelopment analysis (DEA), Charnes Cooper Rhodes (CCR), Weighted Sum Approach (WSA)
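For the special case of one input and one output, the CCR efficiency score reduces to a simple ratio comparison (the general multi-input, multi-output model requires solving a linear program per unit). A sketch with hypothetical park data:

```python
def ccr_efficiency_single(inputs, outputs):
    """CCR efficiency scores for units with one input and one output.
    In this special case the model reduces to normalizing each unit's
    output/input ratio by the best ratio in the set."""
    ratios = [y / x for x, y in zip(inputs, outputs)]
    best = max(ratios)
    return [r / best for r in ratios]

# Hypothetical park data: input = maintenance budget, output = service score.
budgets = [2.0, 4.0, 5.0]
scores  = [4.0, 4.0, 10.0]
efficiencies = ccr_efficiency_single(budgets, scores)  # parks 1 and 3 are efficient
```

Units with efficiency 1.0 lie on the efficient frontier; the others are benchmarked against them.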
Procedia PDF Downloads 218
28430 Simulation-Based Optimization Approach for an Electro-Plating Production Process Based on Theory of Constraints and Data Envelopment Analysis
Authors: Mayada Attia Ibrahim
Abstract:
Evaluating and developing the electroplating production process is a key challenge for this type of process. The process is influenced by several factors, such as process parameters, process costs, and production environments. Analyzing and optimizing all these factors together requires extensive analytical techniques that are not available in real-case industrial entities. This paper presents a practice-based framework for the evaluation and optimization of some of the crucial factors that affect the costs and production times associated with this type of process: energy costs, material costs, and product flow times. Design of Experiments, Discrete-Event Simulation, and the Theory of Constraints were respectively used to identify the most significant factors affecting the production process, to simulate a real production line and recognize the effect of these factors, and to locate possible bottlenecks. Several scenarios are generated as corrective strategies for improving the production line. Following that, the input-oriented CCR data envelopment analysis (DEA) model is used to evaluate and optimize the suggested scenarios.
Keywords: electroplating process, simulation, design of experiment, performance optimization, theory of constraints, data envelopment analysis
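The Theory of Constraints step, locating the bottleneck that caps line throughput, can be sketched directly; the stage names and rates below are hypothetical:

```python
def find_bottleneck(station_rates):
    """Under the Theory of Constraints, line throughput is capped by the
    slowest station; identify it from per-station rates (units/hour)."""
    constraint = min(station_rates, key=station_rates.get)
    return constraint, station_rates[constraint]

# Hypothetical electroplating line stages and processing rates (units/hour).
rates = {"degrease": 60, "acid_dip": 55, "plating_bath": 30, "rinse_dry": 50}
station, throughput = find_bottleneck(rates)
```

Improvement scenarios then target the constraint first, since raising capacity anywhere else cannot lift line throughput above 30 units/hour.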
Procedia PDF Downloads 97
28429 Beliefs on Reproduction of Women in Fish Port Community: An Explorative Study on the Beliefs on Conception, Childbirth, and Maternal Care of Women in Navotas Fish Port Community
Authors: Marie Kristel A. Gabawa
Abstract:
The accessibility of health programs, specifically family planning and maternal and child health care (FP/MCH) programs, is generally low in urban poor communities. Moreover, most FP/MCH programs are framed in medical terms that are usually absent from urban poor dwellers' ideation of the body. This study aims to explore beliefs on reproduction, encompassing, but not limited to, beliefs on conception, pregnancy, and maternal and child health care. The study sites are the two barangays of North Bay Boulevard South 1 (NBBS1) and North Bay Boulevard South 2 (NBBS2), the two residential communities nearest the Navotas Fish Port Complex (NFPC). Data gathered will be analyzed using the grounded-theory method, with the theories of cultural materialism and equity feminism as foundations. Survey questionnaires, key informant interviews, and focus group discussions will be utilized in gathering data. Further, it is recommended that the findings be presented to health program initiators, who can use the data gathered to customize FP/MCH programs to the perceptions and beliefs of women residing in NBBS1 and NBBS2 and to correct any misinformation about FP/MCH techniques.
Keywords: beliefs on reproduction, fish port community, family planning, maternal and child health care, Navotas
Procedia PDF Downloads 259
28428 Analysis of the Effect of Farmers’ Socio-Economic Factors on Net Farm Income of Catfish Farmers in Kwara State, Nigeria
Authors: Olanike A. Ojo, Akindele M. Ojo, Jacob H. Tsado, Ramatu U. Kutigi
Abstract:
The study analyzed the effect of farmers' socio-economic factors on the net farm income of catfish farmers in Kwara State, Nigeria. Primary data were collected from selected catfish farmers with the aid of a well-structured questionnaire, and a multistage sampling technique was used to select 102 catfish farmers in the area. The analytical techniques involved the use of descriptive statistics and multiple regression analysis. The analysis of the socio-economic characteristics of the catfish farmers reveals that 60% of the catfish farmers in the study area were male, which implies a gender imbalance in the area. The mean age of 47 years indicates that the farmers were at their economically productive age and could contribute positively to increased catfish production in the area. The mean household size was five, as was the mean number of years of experience. The latter implies that the farmers were experienced in fishing techniques, breeding, and fish culture, which would assist in generating more revenue, reducing the cost of production, and eventually increasing the farmers' profit levels. The results also revealed that stock capacity (X3), accessibility to credit (X7), and labour (X4) were the main determinants of catfish production in the area. In addition, the farmer's sex, household size, number of ponds, distance of the farm from the market, and access to credit were the main socio-economic factors influencing the net farm income of the catfish farmers in the area. The most serious constraints militating against catfish production in the study area were a high mortality rate, an insufficient market, inadequate credit facilities/finance, and inadequate skilled labour for the daily production routine.
Based on the findings, it is therefore recommended that, to reduce the mortality rate of catfish, extension agents organize training workshops on improved methods and techniques of raising catfish from the juvenile stage to market size.
Keywords: credit, income, stock, mortality
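Net farm income itself is a simple accounting identity (total revenue minus total variable and fixed costs). A sketch with invented per-cycle figures for a single pond:

```python
def net_farm_income(total_revenue, variable_costs, fixed_costs):
    """Net farm income = total revenue - total variable costs - total fixed costs."""
    return total_revenue - sum(variable_costs.values()) - sum(fixed_costs.values())

# Hypothetical per-cycle figures for one catfish pond (currency units).
revenue = 500_000
variable_costs = {"feed": 180_000, "fingerlings": 60_000, "labour": 50_000}
fixed_costs = {"pond_depreciation": 40_000, "interest_on_credit": 20_000}
nfi = net_farm_income(revenue, variable_costs, fixed_costs)
```

The regression then relates this NFI figure, computed per farmer, to the socio-economic variables listed in the abstract.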
Procedia PDF Downloads 332
28427 Learning about the Strengths and Weaknesses of Urban Climate Action Plans
Authors: Prince Dacosta Aboagye, Ayyoob Sharifi
Abstract:
Cities respond to climate concerns mainly through their climate action plans (CAPs). A comprehensive content analysis of the dynamics in existing urban CAPs is not well represented in the literature, and this void makes it difficult to appreciate the strengths and weaknesses of urban CAPs. Here, we perform a qualitative content analysis (QCA) on CAPs from 278 cities worldwide and use text-mining tools to map and visualize the relevant data. Our analysis showed a decline in the number of CAPs developed and published following the global COVID-19 lockdown period. Evidently, megacities are leading the deep decarbonisation agenda. We also observed a transition from developing mainly mitigation-focused CAPs pre-COP21 to CAPs covering both mitigation and adaptation. A lack of inclusiveness in local climate planning was common among European and North American cities. This evidence is a catalyst for understanding the trends in existing urban CAPs and for shaping future urban climate planning.
Keywords: urban, climate action plans, strengths, weaknesses
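A first text-mining pass of the kind described, counting how many CAPs mention a given theme, can be sketched in a few lines; the documents and terms below are invented examples:

```python
from collections import Counter

def term_frequencies(documents, terms):
    """Count how many documents mention each term (case-insensitive substring match)."""
    counts = Counter()
    for doc in documents:
        text = doc.lower()
        for term in terms:
            if term in text:
                counts[term] += 1
    return counts

# Invented one-line stand-ins for CAP texts.
caps = [
    "Mitigation targets for transport and buildings",
    "Adaptation plan: flood defence and heat mitigation",
    "Deep decarbonisation pathway with adaptation measures",
]
counts = term_frequencies(caps, ["mitigation", "adaptation"])
```

Tracking such counts over publication years is one way to surface the mitigation-to-adaptation transition the abstract reports.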
Procedia PDF Downloads 97
28426 Application of Blockchain Technology in Geological Field
Authors: Mengdi Zhang, Zhenji Gao, Ning Kang, Rongmei Liu
Abstract:
Management and application of geological big data is an important part of China's national big data strategy. With the implementation of the national big data strategy, geological big data management becomes more and more critical. At present, there are still many technical barriers, as well as conceptual confusion, in several aspects of geological big data management and application, such as data sharing, intellectual property protection, and application technology. Therefore, making better use of new technologies for deeper exploitation and wider application of geological big data is a key task. In this paper, we briefly introduce the basic principles of blockchain technology and then analyze the dilemmas in applying geological data. Based on this analysis, we bring forward some feasible patterns and scenarios for applying blockchain to geological big data and put forward several suggestions for future work in geological big data management.
Keywords: blockchain, intellectual property protection, geological data, big data management
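The tamper-evidence property that makes blockchain attractive for data sharing and provenance can be illustrated with a minimal hash-linked ledger (a sketch, not a production design; the dataset names are invented):

```python
import hashlib
import json

def record_hash(record, prev_hash):
    """Hash a record together with the previous block's hash, chaining the blocks."""
    payload = json.dumps(record, sort_keys=True) + prev_hash
    return hashlib.sha256(payload.encode("utf-8")).hexdigest()

def append_block(chain, record):
    prev = chain[-1]["hash"] if chain else "0" * 64
    chain.append({"record": record, "prev": prev, "hash": record_hash(record, prev)})

def verify_chain(chain):
    """Recompute every hash; any tampered record or broken link fails verification."""
    prev = "0" * 64
    for block in chain:
        if block["prev"] != prev or block["hash"] != record_hash(block["record"], prev):
            return False
        prev = block["hash"]
    return True

chain = []
append_block(chain, {"dataset": "borehole_logs_A1", "owner": "survey_team_1"})
append_block(chain, {"dataset": "gravity_grid_B7", "owner": "lab_2"})
```

Because each block's hash covers the previous one, silently editing an earlier ownership record invalidates every later block, which is the property that supports provenance and intellectual property claims over shared geological datasets.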
Procedia PDF Downloads 91