Search results for: sensor node data processing
26217 Image Denoising Using Spatial Adaptive Mask Filter for Medical Images
Authors: R. Sumalatha, M. V. Subramanyam
Abstract:
In medical image processing, image quality is degraded in the presence of noise. In ultrasound and magnetic resonance imaging in particular, the data are corrupted by impulse noise known as salt-and-pepper noise. Removing noise from medical images is a critical issue for researchers. In this paper, a new technique, the Adaptive Spatial Mask Filter (ASMF), is proposed to improve the quality of MRI and ultrasound images. Experimental results show that the proposed filter outperforms the mean, median, and adaptive median filters in terms of MSE and PSNR.
Keywords: salt and pepper noise, ASMF, PSNR, MSE
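The MSE and PSNR figures of merit used in this abstract can be sketched as follows; the ASMF itself is not described here, so a plain 3x3 median filter stands in as the baseline it is compared against, and the test image and noise level are illustrative assumptions.

```python
import numpy as np

def mse(a, b):
    """Mean squared error between a reference image and a test image."""
    return float(np.mean((np.asarray(a, float) - np.asarray(b, float)) ** 2))

def psnr(a, b, peak=255.0):
    """Peak signal-to-noise ratio in dB; higher means closer to the reference."""
    e = mse(a, b)
    return float("inf") if e == 0 else 10.0 * np.log10(peak ** 2 / e)

def median3(img):
    """3x3 median filter with edge replication, a standard baseline for
    salt-and-pepper noise (the adaptive mask filter is compared against it)."""
    p = np.pad(img, 1, mode="edge")
    h, w = img.shape
    windows = [p[i:i + h, j:j + w] for i in range(3) for j in range(3)]
    return np.median(np.stack(windows), axis=0)

rng = np.random.default_rng(0)
clean = np.tile(np.linspace(0, 255, 64), (64, 1))      # smooth test scene
noisy = clean.copy()
hit = rng.random(clean.shape) < 0.1                    # 10% impulse noise
noisy[hit] = rng.choice([0.0, 255.0], size=int(hit.sum()))
restored = median3(noisy)
print(psnr(clean, restored) > psnr(clean, noisy))      # True: filtering raises PSNR
```

A proposed filter would be evaluated by substituting it for `median3` and comparing the two PSNR values.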
Procedia PDF Downloads 436
26216 Interpreting Privacy Harms from a Non-Economic Perspective
Authors: Christopher Muhawe, Masooda Bashir
Abstract:
With the growth of information and communication technology (ICT), the virtual world has become the new normal. At the same time, massive amounts of data are being collected by both private and public entities on an unprecedented scale. Unfortunately, this increase in data collection has been accompanied by an increase in data misuse and data breaches. Regrettably, the majority of data breach and data misuse claims have been unsuccessful in United States courts for failure to prove direct injury to physical or economic interests. The requirement to express data privacy harms in economic or physical terms negates the fact that not all data harms are physical or economic in nature. The challenge is compounded by the fact that data breach harms and risks do not materialize immediately. This research uses a descriptive and normative approach to show that not all data harms can be expressed in economic or physical terms. Expressing privacy harms purely from an economic or physical perspective ignores the fact that data insecurity may result in harms that run counter to the functions of privacy in our lives: the promotion of liberty, selfhood, autonomy, and human social relations, and the furtherance of a free society. No economic value can be placed on these functions of privacy. The proposed approach addresses data harms from a psychological and social perspective.
Keywords: data breach and misuse, economic harms, privacy harms, psychological harms
Procedia PDF Downloads 195
26215 Renewable Energy Micro-Grid Control Using Microcontroller in LabVIEW
Authors: Meena Agrawal, Chaitanya P. Agrawal
Abstract:
Power systems are transforming and becoming smarter, with technological innovations enabling them to address sustainable energy needs, rising environmental concerns, economic benefits, and quality requirements simultaneously. Smart control technologies are making the interconnection of renewable energy resources more viable and dependable, and these efforts mitigate the main limitation of most renewable resources, namely their diversity and intermittency, which cause problems in power quality, grid stability, reliability, and security. Optimal energy management by intelligent micro-grids at the distribution end of the power system is widely regarded as necessary to accommodate sustainable renewable distributed energy resources on a large scale across the power grid. All over the world, smart grids are now emerging as a foremost concern of infrastructure upgrade programs. The hardware setup includes an NI cRIO 9022 Compact Reconfigurable Input/Output controller board connected to the PC through a LAN router, with three hardware modules. The real-time embedded controller is a reconfigurable device consisting of an embedded real-time processor for communication and processing, a reconfigurable chassis housing the user-programmable FPGA, eight hot-swappable I/O modules, and the graphical LabVIEW system design software. It has been employed for signal analysis, control, acquisition, and logging of the renewable sources using LabVIEW Real-Time applications. The cRIO chassis controls the timing of the modules and handles communication with the PC over USB, Ethernet, or 802.11 Wi-Fi, combining modular I/O, real-time processing, and NI LabVIEW programmability. In the presented setup, five channels of the NI 9205 analog input module are used for analog voltage signals from the renewable energy sources, and four channels of the NI 9227 module are used for their analog current signals.
For switching actions based on the logic developed in software, a module of electromechanical relays (single-pole single-throw, 4 channels, electrically isolated, with an LED indicating the state of each channel) is used to isolate the renewable sources when a fault occurs, as decided by the program logic. The module for Ethernet-based data acquisition, the ENET 9163 Ethernet Carrier, is connected to the LAN router and carries the NI 9229 module for data acquisition from a remote source over Ethernet. The LabVIEW platform is employed for efficient data acquisition, monitoring, and control. The control logic used in the program to operate the fault relays is portrayed as a flowchart. A communication system has been successfully developed among the sources and loads connected on different computers using the Hypertext Transfer Protocol (HTTP) over an Ethernet local area network with the TCP/IP protocol. Two main I/O interfacing clients control the switching of the renewable energy sources over the internet or an intranet. The paper presents experimental results for the described setup for intelligent control of a micro-grid for renewable energy sources, with data acquisition and control hardware based on a microcontroller and a visual program developed in LabVIEW.
Keywords: data acquisition and control, LabVIEW, microcontroller cRIO, Smart Micro-Grid
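The fault-relay flowchart is not reproduced in the abstract, but its over/under-voltage and over-current checks can be sketched as plain logic; the threshold values and function name here are illustrative assumptions, not the paper's (whose actual logic runs in LabVIEW).

```python
def relay_states(voltages, currents, v_limits=(180.0, 260.0), i_max=10.0):
    """Return True (relay closed, source connected) or False (source isolated)
    for each renewable source, mirroring a simple fault check: the relay stays
    closed only while voltage is within limits and current below the maximum.
    Thresholds are illustrative, not taken from the paper."""
    states = []
    for v, i in zip(voltages, currents):
        ok = v_limits[0] <= v <= v_limits[1] and abs(i) <= i_max
        states.append(ok)
    return states

# Source 2 under-voltage, source 3 over-current: both get isolated.
print(relay_states([230.0, 150.0, 240.0], [5.0, 2.0, 12.0]))  # [True, False, False]
```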
Procedia PDF Downloads 333
26214 An Event-Related Potential Study of Individual Differences in Word Recognition: The Evidence from Morphological Knowledge of Sino-Korean Prefixes
Authors: Jinwon Kang, Seonghak Jo, Joohee Ahn, Junghye Choi, Sun-Young Lee
Abstract:
Morphological priming studies have shown their importance by demonstrating that segmentation into morphemes occurs when visual words are recognized within a noticeably short time. Focusing on Sino-Korean prefixes, this study conducted an experiment using visual masked priming tasks with a 57 ms stimulus-onset asynchrony (SOA) to see how individual differences in the amount of morphological knowledge affect morphological priming. The relationships between the prime and target words were classified into morphological (e.g., 미개척 migaecheog [unexplored] – 미해결 mihaegyel [unresolved]), semantic (e.g., 친환경 chinhwangyeong [eco-friendly] – 무공해 mugonghae [no-pollution]), and orthographic (e.g., 미용실 miyongsil [beauty shop] – 미확보 mihwagbo [uncertainty]) conditions. We then compared the priming by configuring unrelated paired stimuli as the control for each condition. In the behavioral data, we observed facilitatory priming in the group with high morphological knowledge only under the morphological condition, whereas the group with low morphological knowledge showed priming only under the orthographic condition. In the event-related potential (ERP) data, the group with high morphological knowledge presented the N250 only under the morphological condition. The findings imply that individual differences in morphological knowledge in Korean may have a significant influence on the segmental processing of Korean word recognition.
Keywords: ERP, individual differences, morphological priming, Sino-Korean prefixes
Procedia PDF Downloads 215
26213 The Role of Executive Attention and Literacy on Consumer Memory
Authors: Fereshteh Nazeri Bahadori
Abstract:
In today's competitive environment, any company that aims to operate in a market, whether industrial or consumer, must recognize that it cannot address all customer tastes and demands at once and serve them all. The study of consumer memory is an important subject in marketing research, and many companies have studied it and the factors affecting it because of its importance. The current study therefore investigates the relationship between consumers' attention, literacy, and memory. Memory has a very close relationship with learning: memory is the collection of all the information that we have understood and stored. One important subject in consumer behavior is information processing by the consumer, and one important factor in information processing is the mental involvement of the consumer, which has attracted much attention in the past two decades. Since consumers are the focal point of all marketing activities, successful marketing begins with understanding why and how consumers behave. Accordingly, the role of executive attention and literacy in consumers' memory was investigated. The results showed that executive attention and literacy play a significant role in the long-term and short-term memory of consumers.
Keywords: literacy, consumer memory, executive attention, psychology of consumer behavior
Procedia PDF Downloads 96
26212 Machine Learning Analysis of Student Success in Introductory Calculus Based Physics I Course
Authors: Chandra Prayaga, Aaron Wade, Lakshmi Prayaga, Gopi Shankar Mallu
Abstract:
This paper presents the use of machine learning algorithms to predict the success of students in an introductory physics course. A dataset of 140 rows pertaining to the performance of two batches of students was used. The lack of sufficient data to train robust machine learning models was compensated for by generating synthetic data similar to the real data. CTGAN and CTGAN with Gaussian Copula were used to generate synthetic data, with the real data as input. To check the similarity between the real data and each synthetic dataset, pair plots were made. The synthetic data were used to train machine learning models with the PyCaret package. For the CTGAN data, the AdaBoost Classifier (ADA) was found to be the best-fitting model, whereas CTGAN with Gaussian Copula yielded Logistic Regression (LR) as the best model. Both models were then tested for accuracy against the real data. ROC-AUC analysis was performed for all ten classes of the target variable (grades A, A-, B+, B, B-, C+, C, C-, D, F). The ADA model with CTGAN data showed a mean AUC score of 0.4377, while the LR model with the Gaussian Copula data showed a mean AUC score of 0.6149. ROC-AUC plots were obtained for each grade value separately. The LR model with Gaussian Copula data showed consistently better AUC scores than the ADA model with CTGAN data, except for two grade values, C- and A-.
Keywords: machine learning, student success, physics course, grades, synthetic data, CTGAN, Gaussian Copula CTGAN
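The per-grade and mean ROC-AUC evaluation described above can be sketched without any ML framework via the rank-sum identity; the grade labels and probability columns below are toy values for illustration, not the study's data.

```python
import numpy as np

def roc_auc(y_true, scores):
    """Binary ROC-AUC via the Mann-Whitney rank identity (no tied positives assumed)."""
    order = np.argsort(scores)
    ranks = np.empty(len(scores))
    ranks[order] = np.arange(1, len(scores) + 1)
    pos = np.asarray(y_true) == 1
    n_pos, n_neg = int(pos.sum()), int((~pos).sum())
    return float((ranks[pos].sum() - n_pos * (n_pos + 1) / 2) / (n_pos * n_neg))

def mean_ovr_auc(y, score_matrix, classes):
    """Mean one-vs-rest AUC across classes, as reported per grade in the paper."""
    return float(np.mean([roc_auc((np.asarray(y) == c).astype(int), score_matrix[:, k])
                          for k, c in enumerate(classes)]))

y = np.array(["A", "B", "A", "F", "B", "F"])
scores = np.array([[0.8, 0.1, 0.1],    # columns: P(A), P(B), P(F)
                   [0.2, 0.7, 0.1],
                   [0.6, 0.3, 0.1],
                   [0.1, 0.2, 0.7],
                   [0.3, 0.6, 0.1],
                   [0.2, 0.2, 0.6]])
print(mean_ovr_auc(y, scores, ["A", "B", "F"]))  # 1.0 -- perfectly separated toy scores
```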
Procedia PDF Downloads 44
26211 Detecting Music Enjoyment Level Using Electroencephalogram Signals and Machine Learning Techniques
Authors: Raymond Feng, Shadi Ghiasi
Abstract:
An electroencephalogram (EEG) is a non-invasive technique that records electrical activity in the brain using scalp electrodes. Researchers have studied the use of EEG to detect emotions and moods by collecting signals from participants and analyzing how those signals correlate with their activities. In this study, the relationship between EEG signals and music enjoyment was investigated. Participants listened to music while data were collected. During the signal-processing phase, power spectral densities (PSDs) were computed from the signals, and dominant brainwave frequencies were extracted from the PSDs to form a comprehensive feature matrix. A machine learning approach was then taken to find correlations between the processed data and the music enjoyment level indicated by the participants. To improve on previous research, multiple machine learning models were employed, including a K-Nearest Neighbors classifier, a Support Vector classifier, and a Decision Tree classifier. Hyperparameter tuning was used to further increase each model's performance. The experiments showed that a strong correlation exists, with the tuned Decision Tree classifier yielding 85% accuracy. This study suggests that EEG is a reliable means of detecting music enjoyment, with future applications including personalized music recommendation, mood adjustment, and mental health therapy.
Keywords: EEG, electroencephalogram, machine learning, mood, music enjoyment, physiological signals
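The PSD-based feature extraction step can be sketched as a periodogram followed by band averaging; the band definitions, sampling rate, and synthetic 10 Hz signal below are standard illustrative assumptions, not the study's recording parameters.

```python
import numpy as np

# Canonical EEG bands in Hz (illustrative; the study's exact bands are not given).
BANDS = {"delta": (1, 4), "theta": (4, 8), "alpha": (8, 13), "beta": (13, 30)}

def band_powers(signal, fs):
    """Periodogram PSD via the FFT, then mean power per EEG band --
    one row of the feature matrix described in the abstract."""
    freqs = np.fft.rfftfreq(len(signal), 1.0 / fs)
    psd = np.abs(np.fft.rfft(signal)) ** 2 / (fs * len(signal))
    return [psd[(freqs >= lo) & (freqs < hi)].mean() for lo, hi in BANDS.values()]

fs = 128
t = np.arange(0, 4, 1 / fs)
alpha_wave = np.sin(2 * np.pi * 10 * t)       # pure 10 Hz (alpha-band) component
feats = band_powers(alpha_wave, fs)
print(max(range(4), key=lambda k: feats[k]))  # 2 -> the alpha band dominates
```

Such band-power rows, stacked per trial, form the feature matrix fed to the classifiers.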
Procedia PDF Downloads 62
26210 Digital Holographic Interferometric Microscopy for the Testing of Micro-Optics
Authors: Varun Kumar, Chandra Shakher
Abstract:
Micro-optical components such as microlenses and microlens arrays have numerous engineering and industrial applications: collimation of laser diodes, imaging devices for sensor systems (CCD/CMOS, document copiers, etc.), beam homogenization for high-power lasers, the critical component of the Shack-Hartmann sensor, and fiber-optic coupling and optical switching in communication technology. Micro-optical components have also become an alternative for applications where miniaturization and reduction of alignment and packaging costs are necessary. Compliance with high quality standards in the manufacture of micro-optical components is a precondition for competing in worldwide markets, so high demands are placed on quality assurance. Quality assurance of these lenses requires an economical measurement technique: for cost and time reasons it should be fast, simple (for production use), and robust, with high resolution, and it should provide non-contact, non-invasive, full-field information about the shape of the micro-optical component under test. Interferometric techniques are non-contact and non-invasive and provide full-field information about the shape of optical components. Conventional interferometric techniques such as holographic interferometry or Mach-Zehnder interferometry are available for characterizing microlenses; however, they require more experimental effort and are time-consuming. Digital holography (DH) overcomes these problems. Digital holographic microscopy (DHM) allows one to extract both the amplitude and phase information of a wavefront transmitted through a transparent object (a microlens or microlens array) from a single recorded digital hologram using numerical methods, and the complex object wavefront can be reconstructed numerically at different depths.
Digital holography provides axial resolution in the nanometer range, while lateral resolution is limited by diffraction and the size of the sensor. In this paper, a Mach-Zehnder-based digital holographic interferometric microscope (DHIM) is used for the testing of transparent microlenses. The advantage of the DHIM is that distortions due to aberrations in the optical system are avoided by interferometric comparison of the reconstructed phase with and without the object (the microlens array). In the experiment, a first digital hologram is recorded in the absence of the sample as a reference hologram, and a second hologram is recorded in the presence of the microlens array, which induces a phase change in the transmitted laser light. The complex amplitude of the object wavefront in the presence and absence of the microlens array is reconstructed using the Fresnel reconstruction method, from which the phase of the object wave in each state is evaluated. The phase difference between the two states of the object wave gives the optical path length change due to the shape of the microlens. Knowing the refractive indices of the microlens array material and of air, the surface profile of the microlens array is evaluated, and the sag and radius of curvature of the microlenses are reported. The measured sag agrees with the manufacturer's specification within experimental limits.
Keywords: micro-optics, microlens array, phase map, digital holographic interferometric microscopy
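The phase-to-height step can be written out explicitly under a single-transmission-pass model, Δφ = (2π/λ)(n_lens − n_air)·h; the wavelength and refractive index below are assumed example values (633 nm He-Ne line, fused-silica-like n ≈ 1.457), not the paper's.

```python
import numpy as np

def sag_from_phase(delta_phi, wavelength_nm, n_lens, n_air=1.0):
    """Surface height (sag) in nm from the measured phase difference,
    assuming a single transmission pass through the lens material:
    delta_phi = (2*pi / lambda) * (n_lens - n_air) * h."""
    return delta_phi * wavelength_nm / (2 * np.pi * (n_lens - n_air))

# A full 2*pi phase change at 633 nm with n = 1.457 corresponds to ~1385 nm of sag.
h = sag_from_phase(2 * np.pi, 633.0, 1.457)
print(round(h, 1))
```

Applied pixel-wise to the unwrapped phase-difference map, this yields the surface profile from which sag and radius of curvature are fitted.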
Procedia PDF Downloads 499
26209 Performance Evaluation of Refinement Method for Wideband Two-Beams Formation
Authors: C. Bunsanit
Abstract:
This paper presents a refinement method for two-beam formation with a wideband smart antenna. The refinement of the weighting coefficients is based on fully spatial signal processing using the Inverse Discrete Fourier Transform (IDFT), and simulation results are presented using MATLAB. The radiation pattern is created by multiplying the incoming signal by real weights and summing the products. These real weighting coefficients are computed by the IDFT method; however, the range of the weight values is relatively wide, so the refinement method is used to reduce it. The radiation pattern is controlled by five input parameters: the maximum weighting coefficient, the wideband signal, the mainbeam direction, the beamwidth, and the maximum minor-lobe level. Comparison of the simulation results obtained with the refinement method and with the IDFT alone shows that the refinement method works well for wideband two-beam formation.
Keywords: fully spatial signal processing, beam forming, refinement method, smart antenna, weighting coefficient, wideband
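The core IDFT step can be sketched as follows: sampling a desired two-beam pattern at conjugate-symmetric indices makes the IDFT weights real, and the DFT of those weights reproduces the pattern. The paper's actual refinement rule is not given, so the clipping step below is only an illustrative stand-in for reducing the weight range.

```python
import numpy as np

n = 16
desired = np.zeros(n)
desired[2] = 1.0          # first beam direction (angular sample index 2)
desired[n - 2] = 1.0      # mirror sample, so the IDFT weights come out real

w = np.real(np.fft.ifft(desired))          # real weighting coefficients
realised = np.abs(np.fft.fft(w))           # pattern produced by these weights
print(sorted(int(i) for i in np.argsort(realised)[-2:]))  # [2, 14]: both beams recovered

# Refinement (illustrative only -- the paper's method differs): compress the
# spread of weight values by clipping to a fraction of the peak magnitude.
w_ref = np.clip(w, -0.8 * np.abs(w).max(), 0.8 * np.abs(w).max())
print(np.ptp(w_ref) < np.ptp(w))           # True: the weight range shrinks
```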
Procedia PDF Downloads 226
26208 Characterization of Forest Fire Fuel in Shivalik Himalayas Using Hyperspectral Remote Sensing
Authors: Neha Devi, P. K. Joshi
Abstract:
A fire fuel map is one of the most critical inputs for planning and managing fire hazard and risk. Wildfire is one of the most significant forms of global disturbance, impacting community dynamics, biogeochemical cycles, and local and regional climate across a wide range of ecosystems, from boreal forests to tropical rainforests. Assessment of fire danger is a function of forest type, fuelwood stock volume, moisture content, degree of senescence, and the fire management strategy adopted on the ground. Remote sensing has the potential to reduce the uncertainty in mapping fuels, and hyperspectral remote sensing is emerging as a very promising technology for wildfire fuel characterization. Fine spectral information also facilitates mapping of biophysical and chemical properties directly related to the quality of forest fire fuels, including above-ground live biomass, canopy moisture, etc. We used Hyperion imagery acquired in February 2016 by the Hyperion sensor on board the EO-1 satellite over the Shivalik Himalayas, covering the Champawat area of Uttarakhand state, and analysed four fuel characteristics. The main objective of this study was to present an overview of methodologies for mapping fuel properties using hyperspectral remote sensing data. The fuel characteristics analysed were fuel biomass, fuel moisture, fuel condition, and fuel type. Fuel moisture and fuel biomass were assessed through the expression of the liquid water bands. Fuel condition and type were assessed using green vegetation, non-photosynthetic vegetation, and soil as endmembers for spectral mixture analysis. Linear spectral unmixing, a partial spectral unmixing algorithm, was used to identify the spectral abundance of green vegetation, non-photosynthetic vegetation, and soil.
Keywords: forest fire fuel, Hyperion, hyperspectral, linear spectral unmixing, spectral mixture analysis
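The linear spectral unmixing step solves, per pixel, pixel ≈ E·a for the abundance vector a, where the columns of E are the endmember spectra. A minimal unconstrained least-squares sketch follows; the four-band spectra are toy numbers, not Hyperion reflectances, and real workflows add sum-to-one and non-negativity constraints.

```python
import numpy as np

def unmix(pixel, endmembers):
    """Unconstrained least-squares abundance estimate for one pixel:
    pixel ~ endmembers @ abundances. Columns of `endmembers` are the
    green-vegetation, non-photosynthetic-vegetation and soil spectra."""
    abundances, *_ = np.linalg.lstsq(endmembers, pixel, rcond=None)
    return abundances

# Toy 4-band endmember spectra (columns: GV, NPV, soil) -- illustrative values.
E = np.array([[0.05, 0.30, 0.40],
              [0.45, 0.35, 0.42],
              [0.30, 0.40, 0.45],
              [0.10, 0.45, 0.50]])
true_ab = np.array([0.6, 0.3, 0.1])
pixel = E @ true_ab                    # synthesize a mixed pixel
est = unmix(pixel, E)
print(np.allclose(est, true_ab))       # True: abundances recovered
```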
Procedia PDF Downloads 165
26207 The Cultural and Semantic Danger of English Transparent Words Translated from English into Arabic
Authors: Abdullah Khuwaileh
Abstract:
While the teaching and translation of vocabulary is no longer a neglected area in ELT in general and in translation in particular, the psychology of vocabulary acquisition remains neglected. Our paper aims at exploring some of the learning and translating conditions under which vocabulary is acquired and translated properly. To achieve this objective, two teaching methods (experiments) were applied to four translators to measure their acquisition of a number of transparent vocabulary items, some of which were deliberately chosen from 'deceptively transparent words'. All the data, samples, etc. were taken from Jordan University of Science and Technology (JUST) and Yarmouk University, where the researcher is employed. The study showed that translators may translate transparent words inaccurately, particularly if these words are uncontextualised. It was also shown that the morphological structures of words may lead translators, or even EFL learners, to misinterpret meaning.
Keywords: English, transparent, word, processing, translation
Procedia PDF Downloads 71
26206 Fault Detection and Isolation in Sensors and Actuators of Wind Turbines
Authors: Shahrokh Barati, Reza Ramezani
Abstract:
Owing to countries' growing attention to renewable energy production, demand for energy from renewable sources has risen, and among renewable energy sources, wind energy has shown the fastest growth in recent years. In this regard, a Fault Detection and Isolation (FDI) system is necessary to increase the availability of wind turbines. Wind turbines are subject to various faults, including sensor faults, actuator faults, network connection faults, mechanical faults, and faults in the generator subsystem. Although sensors and actuators account for a large number of wind turbine faults, they have been discussed less in the literature. Therefore, in this work, we focus on designing a sensor and actuator fault detection and isolation algorithm and a fault-tolerant control system (FTCS) for wind turbines. The aim of this research is to propose a comprehensive fault detection and isolation system for the sensors and actuators of a wind turbine based on data-driven approaches. To achieve this goal, features of the measurable signals in a real wind turbine are extracted under every operating condition, and feature selection is then performed among the extracted features. Features that lead to maximum class separation are selected; the classifiers are implemented in parallel and their results fused together. To maximize the reliability of the fault decision, the property of fault repeatability is used.
Keywords: FDI, wind turbines, sensors and actuators faults, renewable energy
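The selection-then-fusion scheme can be sketched with a simple between/within-class variance score standing in for the "maximum separation" criterion (the paper's exact criterion is not specified) and a majority vote over the parallel classifiers; all data below are synthetic.

```python
import numpy as np
from collections import Counter

def separation_score(feature, labels):
    """Between-class over within-class variance for one feature -- an
    illustrative stand-in for the 'maximum separation' selection criterion."""
    classes = np.unique(labels)
    grand = feature.mean()
    between = sum((feature[labels == c].mean() - grand) ** 2 for c in classes)
    within = sum(feature[labels == c].var() for c in classes) + 1e-12
    return between / within

def fuse(decisions):
    """Majority vote over the parallel classifiers' fault labels."""
    return Counter(decisions).most_common(1)[0][0]

labels = np.array([0] * 50 + [1] * 50)
rng = np.random.default_rng(2)
useful = np.r_[rng.normal(0, 1, 50), rng.normal(5, 1, 50)]  # separates the classes
noise = rng.normal(0, 1, 100)                               # does not
print(separation_score(useful, labels) > separation_score(noise, labels))  # True
print(fuse(["sensor_fault", "sensor_fault", "no_fault"]))   # sensor_fault
```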
Procedia PDF Downloads 400
26205 Data Access, AI Intensity, and Scale Advantages
Authors: Chuping Lo
Abstract:
This paper presents a simple model demonstrating that, ceteris paribus, countries with lower barriers to accessing global data tend to earn higher incomes than other countries. Large countries, which inherently have greater data resources, therefore tend to have higher incomes than smaller countries, and the former may be more hesitant than the latter to liberalize cross-border data flows in order to maintain this advantage. Furthermore, countries with higher artificial intelligence (AI) intensity in production technologies tend to benefit more from economies of scale in data aggregation, leading to higher income and more trade, as they are better able to utilize global data.
Keywords: digital intensity, digital divide, international trade, economies of scale
Procedia PDF Downloads 68
26204 Secured Transmission and Reserving Space in Images Before Encryption to Embed Data
Authors: G. R. Navaneesh, E. Nagarajan, C. H. Rajam Raju
Abstract:
Nowadays, multimedia data are used to store secure information. All previous methods allocate space in an image for data embedding after encryption. In this paper, we propose a novel method that reserves space in the image, enclosed by a boundary, before encryption with a traditional RDH algorithm, which makes it easy for the data hider to reversibly embed data in the encrypted image. The proposed method can achieve real-time performance; that is, data extraction and image recovery are free of any error. A secure transmission process is also discussed in this paper, which improves efficiency tenfold compared to the other processes discussed.
Keywords: secure communication, reserving room before encryption, least significant bits, image encryption, reversible data hiding
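The least-significant-bit embedding that underlies such reserving-room RDH schemes can be sketched in a few lines; the reserved-region bookkeeping, boundary handling, and encryption layer of the actual method are omitted, and the pixel values are arbitrary examples.

```python
def embed_lsb(pixels, bits):
    """Write payload bits into the least significant bit of the first
    len(bits) pixels; remaining pixels pass through unchanged."""
    return [(p & ~1) | b for p, b in zip(pixels, bits)] + list(pixels[len(bits):])

def extract_lsb(pixels, n):
    """Read back the first n embedded bits."""
    return [p & 1 for p in pixels[:n]]

cover = [120, 121, 200, 55]
payload = [1, 0, 1]
stego = embed_lsb(cover, payload)
print(extract_lsb(stego, 3) == payload)                     # True: error-free extraction
print(all(abs(a - b) <= 1 for a, b in zip(cover, stego)))   # True: at most 1 level of distortion
```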
Procedia PDF Downloads 412
26203 Identity Verification Using k-NN Classifiers and Autistic Genetic Data
Authors: Fuad M. Alkoot
Abstract:
DNA data have been used in forensics for decades; however, current research looks at using DNA as a biometric identity verification modality, with the goal of improving the speed of identification. We aim to use gene data that were initially collected for autism detection to determine whether, and how accurately, these data can serve identification applications. Mainly, our goal is to find whether our data preprocessing technique yields data useful as a biometric identification tool. We experiment with using the nearest neighbor classifier to identify subjects. Results show that the optimal classification rate is achieved when the test set is corrupted by normally distributed noise with zero mean and a standard deviation of 1, and the rate remains close to optimal as the noise standard deviation increases to 3. This shows that the data can be used for identity verification with high accuracy using a simple classifier such as the k-nearest neighbor (k-NN).
Keywords: biometrics, genetic data, identity verification, k-nearest neighbor
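The verification experiment, i.e. matching a noise-corrupted probe vector to its enrolled subject with 1-NN, can be sketched as below; the random feature vectors stand in for the preprocessed gene data, and the dimensions and noise level (sigma = 1, as in the reported optimum) are illustrative.

```python
import numpy as np

def knn_predict(train_x, train_y, x, k=1):
    """Plain k-NN: majority label of the k nearest training vectors (Euclidean)."""
    d = np.linalg.norm(train_x - x, axis=1)
    nearest = train_y[np.argsort(d)[:k]]
    vals, counts = np.unique(nearest, return_counts=True)
    return vals[np.argmax(counts)]

rng = np.random.default_rng(1)
# One enrolled feature vector per subject (stand-in for preprocessed gene data).
subjects = rng.normal(0.0, 10.0, size=(20, 50))
labels = np.arange(20)
# Probe = enrolled vector corrupted by N(0, 1) noise, as in the experiment.
probe = subjects[7] + rng.normal(0.0, 1.0, size=50)
print(int(knn_predict(subjects, labels, probe, k=1)))  # 7: correct subject recovered
```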
Procedia PDF Downloads 258
26202 Effect of Some Metal Ions on the Activity of Lipase Produced by Aspergillus Niger Cultured on Vitellaria Paradoxa Shells
Authors: Abdulhakeem Sulyman, Olukotun Zainab, Hammed Abdulquadri
Abstract:
Lipases (triacylglycerol acyl hydrolases, EC 3.1.1.3) are a class of enzymes that catalyse the hydrolysis of triglycerides to glycerol and free fatty acids. They account for up to 10% of the enzyme market and have a wide range of applications in biofuel production, detergent formulation, leather processing, and the food and feed processing industry. This research studied the effect of some metal ions on the activity of purified lipase produced by Aspergillus niger cultured on Vitellaria paradoxa shells. Purified lipase in 12.5 mM p-NPL was incubated with different metal ions (Zn²⁺, Ca²⁺, Mn²⁺, Fe²⁺, Na⁺, K⁺ and Mg²⁺) at final concentrations of 0.1, 0.2, 0.3, 0.4, 0.5, 0.6, 0.7, 0.8, 0.9 and 1.0 mM. The results showed that Zn²⁺, Ca²⁺, Mn²⁺ and Fe²⁺ increased lipase activity by up to 3.0-, 3.0-, 1.0-, and 26.0-fold, respectively. Lipase activity was partially inhibited by Na⁺ and Mg²⁺, with up to 88.5% and 83.7% loss of activity, respectively, and by K⁺, with up to 56.7% loss of activity relative to the absence of metal ions. The study concluded that lipase produced by Aspergillus niger cultured on Vitellaria paradoxa shells is activated by Zn²⁺, Ca²⁺, Mn²⁺ and Fe²⁺ and inhibited by Na⁺, K⁺ and Mg²⁺.
Keywords: Aspergillus niger, Vitellaria paradoxa, lipase, metal ions
Procedia PDF Downloads 150
26201 Detection of Image Blur and Its Restoration for Image Enhancement
Authors: M. V. Chidananda Murthy, M. Z. Kurian, H. S. Guruprasad
Abstract:
Image restoration in the process of communication is one of the emerging fields in image processing. Motion analysis is the simplest approach to detecting motion in an image, with applications spread widely across areas such as surveillance, remote sensing, the film industry, and navigation of autonomous vehicles. A scene may contain multiple moving objects; using motion analysis techniques, the blur caused by the movement of the objects can be corrected by filling in occluded regions and reconstructing transparent objects, and motion blurring can also be removed. This paper presents the design and comparison of various motion detection and enhancement filters. Median filtering, linear image deconvolution, the inverse filter, the pseudo-inverse filter, the Wiener filter, the Lucy-Richardson filter, and blind deconvolution are used to remove the blur. In this work, we have considered different types and different amounts of blur for the analysis. Mean Square Error (MSE) and Peak Signal-to-Noise Ratio (PSNR) are used to evaluate the performance of the filters. The designed system has been implemented in MATLAB and tested on synthetic and real-time images.
Keywords: image enhancement, motion analysis, motion detection, motion estimation
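One of the compared filters, the Wiener deconvolution, can be sketched in the frequency domain; the scene, the 5-pixel motion-blur kernel, and the constant noise-to-signal ratio k are illustrative assumptions, not the paper's test cases.

```python
import numpy as np

def wiener_deconvolve(blurred, kernel, k=1e-3):
    """Frequency-domain Wiener filter: F = H* G / (|H|^2 + k), where the
    constant k stands in for the noise-to-signal power ratio."""
    H = np.fft.fft2(kernel, s=blurred.shape)
    G = np.fft.fft2(blurred)
    return np.real(np.fft.ifft2(np.conj(H) * G / (np.abs(H) ** 2 + k)))

img = np.zeros((32, 32))
img[8:24, 8:24] = 1.0                          # simple synthetic scene
kernel = np.ones((1, 5)) / 5.0                 # 5-pixel horizontal motion blur
H = np.fft.fft2(kernel, s=img.shape)
blurred = np.real(np.fft.ifft2(np.fft.fft2(img) * H))
restored = wiener_deconvolve(blurred, kernel)
err = lambda x: float(np.mean((x - img) ** 2))
print(err(restored) < err(blurred))            # True: restoration reduces the error
```

The inverse and pseudo-inverse filters correspond to the limit k → 0, which is why they amplify noise where |H| is small.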
Procedia PDF Downloads 288
26200 Value Chain Analysis of Melon “Egusi” (Citrullus lanatus Thunb. Mansf) among Rural Farm Enterprises in South East, Nigeria
Authors: Chigozirim Onwusiribe, Jude Mbanasor
Abstract:
Egusi melon (Citrullus lanatus Thunb. Mansf.) is a very important oil seed that serves as a major ingredient in the diet of most households in Nigeria. Egusi melon is very nutritious and very important in meeting the food security needs of Nigerians. It is cultivated by most farm enterprises in South East Nigeria, but the profitability of its value chain needs to be investigated. This study analyzed the profitability of the egusi melon value chain. Specifically, it developed a value chain map for egusi melon, analysed the profitability of each stage of the value chain, and analysed the determinants of profitability at each stage. A multi-stage sampling technique was used to select 125 farm enterprises with similar capacity and characteristics. Questionnaires and interviews were used to elicit the required data, while descriptive statistics, the Food and Agriculture Organization value chain analysis tool, profitability ratios, and multiple regression analysis were used for the data analysis. The findings showed that the stages of the egusi melon value chain are very profitable. Based on the findings, we recommend the provision of grants by government and donor agencies to the farm enterprises through their cooperative societies; this would provide the funds needed for local fabrication of value addition and processing equipment suited to the unique value addition needs not met by imported equipment.
Keywords: value, chain, melon, farm, enterprises
Procedia PDF Downloads 136
26199 Efficient of Technology Remediation Soil That Contaminated by Petroleum Based on Heat without Combustion
Authors: Gavin Hutama Farandiarta, Hegi Adi Prabowo, Istiara Rizqillah Hanifah, Millati Hanifah Saprudin, Raden Iqrafia Ashna
Abstract:
The increase in the rate of petroleum consumption encourages industries to optimize and increase their processing of crude oil into petroleum products. However, although the result gives many benefits to humans worldwide, it also negatively impacts the environment. One of the negative impacts of processing crude oil is the contamination of soil by petroleum sewage sludge. This sludge contains hydrocarbon compounds, which can be quantified as Total Petroleum Hydrocarbons (TPH). Petroleum sludge waste is classified as hazardous and toxic, and the soil contamination it causes is very hard to remove. However, soil contaminated by petroleum sludge can be treated using heat (thermal desorption). Several factors affect the success rate of heat-assisted remediation: the temperature, the time, and the air pressure in the desorption column. Remediation using heat is an alternative for recovering soil from petroleum pollution that is highly effective, cheap, and environmentally friendly, producing uncontaminated soil and petroleum that can be used again.
Keywords: petroleum sewage sludge, remediation soil, thermal desorption, total petroleum hydrocarbon (TPH)
Procedia PDF Downloads 247
26198 DeClEx: Processing Pipeline for Tumor Classification
Authors: Gaurav Shinde, Sai Charan Gongiguntla, Prajwal Shirur, Ahmed Hambaba
Abstract:
Health issues are significantly increasing, putting a substantial strain on healthcare services. This has accelerated the integration of machine learning in healthcare, particularly following the COVID-19 pandemic. We introduce DeClEx, a pipeline that ensures data mirrors real-world settings by incorporating Gaussian noise and blur and employing autoencoders to learn intermediate feature representations. Subsequently, our convolutional neural network, paired with spatial attention, provides accuracy comparable to state-of-the-art pre-trained models while achieving a threefold improvement in training speed. Furthermore, we provide interpretable results using explainable AI techniques. Denoising and deblurring, classification, and explainability are thus integrated in a single pipeline, DeClEx. Keywords: machine learning, healthcare, classification, explainability
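The degradation step such a pipeline relies on, adding Gaussian noise and blur so training data mirrors real-world acquisition, can be sketched as follows. This is an illustrative pure-Python version on a toy grayscale image, not the authors' implementation:

```python
import random

def add_gaussian_noise(image, sigma=10.0, seed=0):
    """Add zero-mean Gaussian noise, clipped to the 0-255 intensity range."""
    rng = random.Random(seed)
    return [[min(255.0, max(0.0, px + rng.gauss(0.0, sigma))) for px in row]
            for row in image]

def box_blur(image):
    """3x3 mean blur with edge clamping, a stand-in for the blur step."""
    h, w = len(image), len(image[0])
    out = [[0.0] * w for _ in range(h)]
    for i in range(h):
        for j in range(w):
            neigh = [image[min(max(i + di, 0), h - 1)][min(max(j + dj, 0), w - 1)]
                     for di in (-1, 0, 1) for dj in (-1, 0, 1)]
            out[i][j] = sum(neigh) / 9.0
    return out

clean = [[100.0] * 5 for _ in range(5)]  # toy 5x5 grayscale image
degraded = box_blur(add_gaussian_noise(clean, sigma=5.0))
```

In the pipeline described, images degraded this way would then be fed to the autoencoder to learn noise-robust intermediate representations.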
Procedia PDF Downloads 56
26197 A Review on Intelligent Systems for Geoscience
Authors: R. Palson Kennedy, P. Kiran Sai
Abstract:
This article introduces machine learning (ML) researchers to the hurdles that geoscience problems present, as well as the opportunities for improvement in both ML and the geosciences, presenting a review from the data life cycle perspective. Numerous facets of the geosciences pose unique difficulties for the study of intelligent systems: geoscience data is notoriously difficult to analyze, since it is frequently unpredictable, intermittent, sparse, multi-resolution, and multi-scale. The first half addresses data science's essential concepts and theoretical underpinnings, while the second half covers key themes and shared experiences from current publications focused on each stage of the data life cycle. Finally, themes such as open science, smart data, and team science are considered. Keywords: data science, intelligent systems, machine learning, big data, data life cycle, recent developments, geoscience
Procedia PDF Downloads 135
26196 Optimal Load Control Strategy in the Presence of Stochastically Dependent Renewable Energy Sources
Authors: Mahmoud M. Othman, Almoataz Y. Abdelaziz, Yasser G. Hegazy
Abstract:
This paper presents a load control strategy based on a modification of the Big Bang-Big Crunch optimization method. The proposed strategy aims to determine the optimal load to be controlled and the corresponding time of control in order to minimize the energy purchased from the substation. The presented strategy helps the distribution network operator rely on renewable energy sources to supply the system demand. The renewable energy sources used in this study are modeled using the diagonal band copula method and sequential Monte Carlo simulation in order to accurately account for the multivariate stochastic dependence between wind power, photovoltaic power, and the system demand. The proposed algorithms are implemented in the MATLAB environment and tested on the IEEE 37-node feeder. Several case studies are presented, and the subsequent discussion shows the effectiveness of the proposed algorithm. Keywords: big bang big crunch, distributed generation, load control, optimization, planning
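For orientation, the Big Bang-Big Crunch heuristic that the strategy modifies alternates a random scatter of candidates (big bang) with a collapse to a fitness-weighted centre of mass (big crunch), shrinking the scatter each iteration. A minimal one-dimensional sketch of the base method; the paper's actual modification and load-control objective are not reproduced here:

```python
import random

def big_bang_big_crunch(f, lo, hi, pop=30, iters=40, seed=1):
    """Minimize f on [lo, hi]: scatter candidates around the current
    centre (big bang), then collapse to the fitness-weighted centre of
    mass (big crunch), with the scatter shrinking as 1/k."""
    rng = random.Random(seed)
    centre = rng.uniform(lo, hi)
    for k in range(1, iters + 1):
        cands = [min(hi, max(lo, centre + rng.gauss(0.0, (hi - lo) / (2 * k))))
                 for _ in range(pop)]
        # Inverse-fitness weights pull the centre toward low-cost candidates.
        weights = [1.0 / (1e-12 + f(x)) for x in cands]
        centre = sum(w * x for w, x in zip(weights, cands)) / sum(weights)
    return centre

# Toy objective with minimum at x = 3; a load-control objective would
# instead score a candidate control schedule by purchased energy.
best = big_bang_big_crunch(lambda x: (x - 3.0) ** 2, 0.0, 10.0)
```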
Procedia PDF Downloads 345
26195 PYURF and ZED9 Have a Prominent Role in Association with Molecular Pathways of Bortezomib in Myeloma Cells in Acute Myeloid Leukemia
Authors: Atena Sadat Hosseini, Mohammadhossein Habibi
Abstract:
Acute myeloid leukemia (AML) is the most commonly diagnosed leukemia. In older adults, AML carries a dismal outcome. AML originates with a dominant mutation, then acquires collaborative, transformative mutations leading to myeloid transformation and clinical/biological heterogeneity. Several chemotherapeutic drugs are used for this cancer. These drugs are naturally associated with several side effects, and establishing a more accurate molecular mechanism for these drugs can have a significant impact on selecting better drug candidates for treatment. In this study, we evaluated bortezomib in myeloma cells using bioinformatics analysis of RNA-Seq data, then investigated the molecular pathways and protein-protein interactions associated with this chemotherapy drug. A total of 658 upregulated genes and 548 downregulated genes were identified. AUF1 (hnRNP D0) binding and destabilization of mRNA, degradation of GLI2 by the proteasome, the role of GTSE1 in G2/M progression after the G2 checkpoint, and TCF-dependent signaling in response to WNT were demonstrated among the upregulated genes. Insulin resistance, AKT phosphorylation of nuclear targets, cytosine methylation, the longevity regulating pathway, and signal transduction of the S1P receptor were related to the genes with low expression. With respect to these results, HIST2H2AA3, RP11-96O20.4, ZED9, PRDX1, and DOK2 are candidates among the upregulated genes according to node degree and betweenness; on the other side, PYURF, NRSN1, FGF23, UPK3BL, and STAG3 play a prominent role among the downregulated genes. In summary, using in silico analysis, the present study conducted a precise examination of bortezomib's molecular mechanisms in myeloma cells, enabling further evaluation toward molecular cancer therapy. Naturally, additional experimental and clinical procedures are needed. Keywords: myeloma cells, acute myeloid leukemia, bioinformatics analysis, bortezomib
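The first analysis step, partitioning the RNA-Seq results into the 658 upregulated and 548 downregulated gene sets, amounts to thresholding fold changes. A schematic sketch; the gene symbols follow the abstract, but the log2 fold-change values and the ±1 thresholds are hypothetical, not taken from the study:

```python
def split_deg(log2_fold_changes, up=1.0, down=-1.0):
    """Partition genes into up- and down-regulated sets by log2 fold change."""
    upregulated = [g for g, fc in log2_fold_changes.items() if fc >= up]
    downregulated = [g for g, fc in log2_fold_changes.items() if fc <= down]
    return upregulated, downregulated

# Hypothetical log2 fold changes for a handful of the named genes.
log2fc = {"HIST2H2AA3": 2.4, "PRDX1": 1.6, "DOK2": 1.1,
          "PYURF": -2.1, "NRSN1": -1.5, "ACTB": 0.1}
upregulated, downregulated = split_deg(log2fc)
```

In practice a significance filter (e.g. adjusted p-value) would be applied alongside the fold-change threshold.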
Procedia PDF Downloads 93
26194 Machine Learning in Agriculture: A Brief Review
Authors: Aishi Kundu, Elhan Raza
Abstract:
"Necessity is the mother of invention" - the rapid increase in the global human population has directed the agricultural domain toward machine learning. The basic need of human beings is food, which is satisfied through farming. Farming is one of the major revenue generators for the Indian economy: agriculture is not only a source of employment but also fulfils humans' basic needs, making it a pillar of the economy in developing countries like India. This paper provides a brief review of the progress made in implementing machine learning in the agricultural sector. Accurate predictions are necessary at the right time to boost production and to aid the timely and systematic distribution of agricultural commodities, making their availability in the market faster and more effective. This paper includes a thorough analysis of various machine learning algorithms applied in different aspects of agriculture (crop management, soil management, water management, yield tracking, livestock management, etc.). Crop production is affected by climate change; machine learning can analyse the changing patterns and come up with a suitable approach to minimize loss and maximize yield. Machine learning models (regression, support vector machines, Bayesian models, artificial neural networks, decision trees, etc.) are used in smart agriculture to analyze sensor data and predict specific outcomes, which can be vital in increasing the productivity of the agricultural food industry. Machine learning is an ongoing technology benefitting farmers by improving gains in agriculture and minimizing losses, and this paper discusses how irrigation and farming management systems evolve efficiently in real time.
Artificial intelligence (AI)-enabled programs have emerged to support farmers with extensive examination of data. Keywords: machine learning, artificial intelligence, crop management, precision farming, smart farming, pre-harvesting, harvesting, post-harvesting
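As a concrete instance of the regression models mentioned above, a closed-form least-squares fit of yield against a single input takes only a few lines. The rainfall/yield numbers below are made up for illustration:

```python
def fit_line(xs, ys):
    """Closed-form ordinary least squares for y = a*x + b."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    a = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    return a, my - a * mx

# Made-up seasonal rainfall (mm) vs. crop yield (t/ha) pairs.
rainfall = [50.0, 80.0, 110.0, 140.0, 170.0]
yield_t = [1.0, 1.6, 2.2, 2.8, 3.4]
a, b = fit_line(rainfall, yield_t)
predicted = a * 125.0 + b  # yield forecast for 125 mm of rainfall
```

Real crop-yield models use many more inputs (soil, temperature, irrigation) and the richer model families named in the abstract, but the prediction step has the same shape.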
Procedia PDF Downloads 105
26193 Deep Learning Based Road Crack Detection on an Embedded Platform
Authors: Nurhak Altın, Ayhan Kucukmanisa, Oguzhan Urhan
Abstract:
It is important for traffic safety that highways are in good condition. Road defects (cracks, erosion of lane markings, etc.) can cause accidents by affecting driving. Image processing based methods for detecting road cracks are available in the literature. In this paper, a deep learning based road crack detection approach is proposed. YOLO (You Only Look Once) is adopted as the core component of the presented approach. The YOLO network, originally developed for generic object detection, is trained with road crack images as a new class not previously used in YOLO. The performance of the proposed method is compared across different training schemes: training from randomly initialized weights and training from pre-trained weights (transfer learning). A similar training approach is applied to the simplified version of the YOLO network model (Tiny YOLO), and its performance is examined as well. The developed system is able to process 8 fps on the NVIDIA Jetson TX1 development kit. Keywords: deep learning, embedded platform, real-time processing, road crack detection
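Detection performance for a YOLO-style model is typically scored by intersection-over-union (IoU) between predicted and ground-truth boxes. A sketch of that metric (illustrative, not taken from the paper's evaluation code):

```python
def iou(box_a, box_b):
    """Intersection-over-union of two axis-aligned (x1, y1, x2, y2) boxes."""
    ix1, iy1 = max(box_a[0], box_b[0]), max(box_a[1], box_b[1])
    ix2, iy2 = min(box_a[2], box_b[2]), min(box_a[3], box_b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    return inter / (area_a + area_b - inter)

# A predicted crack box vs. a ground-truth box, half overlapping.
score = iou((0.0, 0.0, 10.0, 10.0), (5.0, 0.0, 15.0, 10.0))
```

A detection counts as correct when its IoU with a ground-truth box exceeds a chosen threshold (0.5 is a common choice).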
Procedia PDF Downloads 339
26192 Molecular Diagnosis of a Virus Associated with Red Tip Disease and Its Detection by Non-Destructive Sensor in Pineapple (Ananas comosus)
Authors: A. K. Faizah, G. Vadamalai, S. K. Balasundram, W. L. Lim
Abstract:
Pineapple (Ananas comosus) is a common crop in tropical and subtropical areas of the world. Malaysia once ranked among the top three pineapple producers in the world in the 1960s and early 1970s, after Hawaii and Brazil, and the government recognizes the pineapple crop as one of the priority commodities to be developed for the domestic and international markets in the National Agriculture Policy. However, the pineapple industry in Malaysia still faces numerous challenges, one of which is the management of disease and pests. Red tip disease of pineapple was first recognized about 20 years ago in a commercial pineapple stand located in Simpang Renggam, Johor, Peninsular Malaysia. Since its discovery, there has been no confirmation of the causal agent of this disease, and its epidemiology is still not fully understood. Nevertheless, the disease symptoms and the spread within the field seem to point toward viral infection. A bioassay test of nucleic acid extracted from red tip-affected pineapple was performed on Nicotiana tabacum cv. Coker by rubbing the extracted sap. Localised lesions were observed 3 weeks after inoculation. Negative staining of the freshly inoculated Nicotiana tabacum cv. Coker showed the presence of membrane-bound spherical particles with an average diameter of 94.25 nm under the transmission electron microscope. The shape and size of the particles were similar to those of a tospovirus. SDS-PAGE analysis of partially purified virions from inoculated N. tabacum produced a strong and a faint protein band with molecular masses of approximately 29 kDa and 55 kDa, while partially purified virions from symptomatic pineapple leaves from the field showed bands with molecular masses of approximately 29 kDa, 39 kDa and 55 kDa. These bands may indicate the nucleocapsid protein identity of a tospovirus.
Furthermore, a handheld sensor, the GreenSeeker, was used to detect red tip symptoms on pineapple non-destructively based on spectral reflectance, measured as the Normalized Difference Vegetation Index (NDVI). Red tip severity was estimated and correlated with NDVI, and linear regression models were calibrated and tested in order to estimate red tip disease severity from NDVI. Results showed a strong positive relationship between red tip disease severity and NDVI (r = 0.84). Keywords: pineapple, diagnosis, virus, NDVI
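The NDVI reported by the GreenSeeker is computed from near-infrared and red reflectance, and the reported r = 0.84 is a Pearson correlation between severity and NDVI. A sketch with hypothetical per-plant readings (illustrative only; the toy data mimics the positive severity-NDVI relationship the study reports):

```python
def ndvi(nir, red):
    """Normalized Difference Vegetation Index from NIR and red reflectance."""
    return (nir - red) / (nir + red)

def pearson_r(xs, ys):
    """Pearson correlation coefficient of two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Hypothetical readings: red tip severity (%) and (NIR, red) reflectances.
severity = [10, 25, 40, 60, 80]
readings = [(0.40, 0.38), (0.45, 0.36), (0.50, 0.34), (0.55, 0.32), (0.60, 0.30)]
ndvi_vals = [ndvi(nir, red) for nir, red in readings]
r = pearson_r(severity, ndvi_vals)
```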
Procedia PDF Downloads 791
26191 Bakla Po Ako (I Am Gay): A Case Study on the Communication Styles of Selected Filipino Gays in Disclosing Their Sexual Orientation to Their Parents
Authors: Bryan Christian Baybay, M. Francesca Ronario
Abstract:
This study is intended to answer the question, "What are the communication styles of selected Filipino gays in breaking their silence on their sexual orientation to their parents?" In this regard, six cases of Filipino gay disclosures were examined through in-depth interviews. The participants were selected through purposive sampling and the snowball technique. The theories of Rhetorical Sensitivity (Roderick Hart) and Communicator Style (Robert Norton) were used to analyze the gathered data and to support the communication attitudes, message processing, message rendering, and communication styles exhibited in each disclosure. As secondary data and validation, parents and experts in the fields of communication, sociology, and psychology were also interviewed and consulted. The study found that Filipino gays vary in the communication styles they use during disclosure to their parents. All communication styles (impression-leaving, contentious, open, dramatic, dominant, precise, relaxed, friendly, animated, and communicator image) were observed, depending on the participants' motivation, relationship, and thoughts contemplated. These results offer ideas for future researchers to look into the communication patterns and/or styles of lesbians, bisexuals, transgenders, and queers, or to expand research on the same subject and the utilization of Social Judgment and Relational Dialectics theories in determining and analyzing LGBTQ communication. Keywords: communication attitudes, communication styles, Filipino gays, self-disclosure, sexual orientation
Procedia PDF Downloads 523
26190 Implementation of Correlation-Based Data Analysis as a Preliminary Stage for the Prediction of Geometric Dimensions Using Machine Learning in the Forming of Car Seat Rails
Authors: Housein Deli, Loui Al-Shrouf, Hammoud Al Joumaa, Mohieddine Jelali
Abstract:
When forming metallic materials, fluctuations in material properties, process conditions, and wear lead to deviations in component geometry. Several hundred features sometimes need to be measured, especially in the case of functional and safety-relevant components. Due to the large number of features and the accuracy requirements, these can only be measured offline. The statistical evaluation of process capability and control measurements minimizes, but does not eliminate, the risk of producing components outside the tolerances. Inspection intervals are based on the acceptable risk and come at the expense of productivity, but remain reactive and, in some cases, considerably delayed. Due to the considerable progress made in the field of condition monitoring and measurement technology, permanently installed sensor systems, in combination with machine learning and artificial intelligence in particular, offer the potential to independently derive forecasts for component geometry and thus eliminate the risk of defective products actively and preventively. The reliability of forecasts depends on the quality, completeness, and timeliness of the data, and measuring all geometric characteristics is neither sensible nor technically possible. This paper therefore uses the example of car seat rail production to discuss the necessary first step of feature selection and reduction by correlation analysis; otherwise, it would not be possible to forecast components in real time and inline. Four different car seat rails with an average of 130 features were selected and measured using a coordinate measuring machine (CMM). The run of such measuring programs alone takes up to 20 minutes. In practice, this results in the risk of faulty production of at least 2000 components that have to be sorted or scrapped if the measurement results are negative.
Over a period of 2 months, all measurement data (>200 measurements per variant) were collected and evaluated using correlation analysis. As part of this study, the number of characteristics to be measured for the car seat rail variants was reduced by over 80%: direct correlations were proven for almost 100 of an average of 125 characteristics across the four products, and a further 10 features correlate via indirect relationships, so that the number of features required for a prediction could be reduced to fewer than 20. A correlation factor >0.8 was assumed for all correlations. Keywords: long-term SHM, condition monitoring, machine learning, correlation analysis, component prediction, wear prediction, regression analysis
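The correlation-based reduction described above can be sketched as a greedy filter: keep a feature only if it is not strongly correlated with one already kept, since the dropped features can then be predicted from the kept ones. The feature names and values below are hypothetical; the 0.8 threshold follows the paper:

```python
def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    var_x = sum((x - mx) ** 2 for x in xs)
    var_y = sum((y - my) ** 2 for y in ys)
    return cov / (var_x * var_y) ** 0.5

def reduce_features(features, threshold=0.8):
    """Greedy filter: keep a feature only if it is not strongly
    correlated (|r| > threshold) with any feature already kept."""
    kept = []
    for name, values in features.items():
        if all(abs(pearson(values, features[k])) <= threshold for k in kept):
            kept.append(name)
    return kept

# Hypothetical seat-rail measurements (mm); flange_gap tracks rail_width
# exactly, so only one of the two needs to be measured.
measurements = {
    "rail_width":  [12.00, 12.02, 11.98, 12.01, 11.99],
    "flange_gap":  [3.50, 3.52, 3.48, 3.51, 3.49],
    "hole_offset": [5.10, 4.90, 5.00, 4.95, 5.05],
}
kept = reduce_features(measurements)
```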
Procedia PDF Downloads 49
26189 The School Threshold's Identity as a Place for Interaction: Research Project with the Participation of Elementary-School Children
Authors: Natalia Bazaiou
Abstract:
The school entrance is one of the most important places in the everyday lives of children. As an intersection between the school and the public realm of the city, it is characterized by gradations of porous and rigid boundaries; depending on its function, it can serve as a threshold or as a boundary. It is also a spatial condition that facilitates a dialogue between the school and the city and draws content from both. School thresholds support the role of the school as an important node in the city and a bridge between the various dynamics of children's everyday lives, demonstrating prominent usage and meaning as a place that is open to the community as well as to possibilities and physical interaction. In this research, we examine the role of the "realm of the in-between" between school and city through architecture workshops for children at the Hill Memorial School in Athens, in which we explore children's perceptions, wishes, and ideas related to their familiar everyday places of transition from school to city and vice versa. The writings of Herman Hertzberger, Aldo van Eyck, Jaap Bakema and others are also discussed. Keywords: threshold, city, play, identity, cinematic tools, children, school architecture
Procedia PDF Downloads 81
26188 Terrestrial Laser Scans to Assess Aerial LiDAR Data
Authors: J. F. Reinoso-Gordo, F. J. Ariza-López, A. Mozas-Calvache, J. L. García-Balboa, S. Eddargani
Abstract:
The quality of a DEM may depend on several factors, such as the data source, the capture method, the type of processing used to derive it, or the cell size. The two most important capture methods for producing regional-sized DEMs are photogrammetry and LiDAR; DEMs covering entire countries have been obtained with these methods. The quality of these DEMs has traditionally been evaluated by the national cartographic agencies through punctual sampling focused on the vertical component. For this type of evaluation there are standards such as the NMAS and the ASPRS Positional Accuracy Standards for Digital Geospatial Data. However, it seems more appropriate to carry out this evaluation with a method that takes into account the superficial nature of the DEM, so that its sampling is superficial rather than punctual. This work is part of the research project "Functional Quality of Digital Elevation Models in Engineering", in which it is necessary to control the quality of a DEM whose data source is an experimental LiDAR flight with a density of 14 points per square meter, which we call the Point Cloud Product (PCpro). The present work describes the data capture on the ground and the post-processing tasks used to obtain the point cloud that serves as the reference (PCref) for evaluating the quality of the PCpro. Each PCref consists of a 50 x 50 m patch obtained by registering scans from 4 different stations. The area studied was the Spanish region of Navarra, which covers 10,391 km2; 30 homogeneously distributed patches were necessary to sample the entire surface. The patches were captured using a Leica BLK360 terrestrial laser scanner mounted on a pole reaching heights of up to 7 meters; the scanner was inverted so that the characteristic shadow circle present in the direct position does not appear.
To ensure that the accuracy of the PCref is greater than that of the PCpro, the georeferencing of the PCref was carried out with real-time GNSS, and its positional accuracy was better than 4 cm; this is much better than the altimetric mean square error estimated for the PCpro (<15 cm). The DEM of interest corresponds to the bare earth, so a filter had to be applied to eliminate vegetation and auxiliary elements such as poles, tripods, etc. After the post-processing tasks, the PCref is ready to be compared with the PCpro using different techniques: cloud to cloud, or DEM to DEM after a resampling process. Keywords: data quality, DEM, LiDAR, terrestrial laser scanner, accuracy
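The cloud-to-cloud comparison mentioned at the end can be sketched as a nearest-neighbour search in planimetry followed by statistics on the vertical differences. A brute-force toy version with made-up coordinates; a real comparison over millions of points would use a spatial index such as a k-d tree:

```python
def cloud_to_cloud_dz(pc_pro, pc_ref):
    """For each produced point, find the planimetrically nearest reference
    point (brute force) and record the vertical difference."""
    dz = []
    for x, y, z in pc_pro:
        _, _, z_ref = min(pc_ref, key=lambda p: (p[0] - x) ** 2 + (p[1] - y) ** 2)
        dz.append(z - z_ref)
    return dz

def rmse(errors):
    """Root mean square of the vertical errors."""
    return (sum(e * e for e in errors) / len(errors)) ** 0.5

# Toy patch: 4 reference points and 2 produced points (x, y, z in metres).
pc_ref = [(0.0, 0.0, 100.00), (1.0, 0.0, 100.10),
          (0.0, 1.0, 100.05), (1.0, 1.0, 100.20)]
pc_pro = [(0.1, 0.1, 100.08), (0.9, 0.9, 100.12)]
vertical_errors = cloud_to_cloud_dz(pc_pro, pc_ref)
```

The resulting RMSE would then be checked against the accuracy budget (here, the <15 cm altimetric error estimated for the PCpro).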
Procedia PDF Downloads 100