Search results for: processing schemes
1201 Change Detection Analysis on Support Vector Machine Classifier of Land Use and Land Cover Changes: Case Study on Yangon
Authors: Khin Mar Yee, Mu Mu Than, Kyi Lint, Aye Aye Oo, Chan Mya Hmway, Khin Zar Chi Winn
Abstract:
The dynamic changes of Land Use and Land Cover (LULC) in Yangon have generally resulted in improved human welfare and economic development over the last twenty years. Mapping LULC is crucially important for the sustainable development of the environment. However, exact data on how environmental factors influence the LULC situation at various scales are difficult to obtain, because the natural environment is composed of non-homogeneous surface features, so the satellite data also contain mixed pixels. The main objective of this study is the calculation of accuracy based on change detection of LULC changes by Support Vector Machines (SVMs). For this research work, the main data were satellite images of 1996, 2006 and 2015. Change detection statistics were computed to compile a detailed tabulation of changes between two classification images, and the Support Vector Machines (SVMs) process was applied with a soft approach at the allocation as well as the testing stage to achieve higher accuracy. The results of this paper showed that vegetation and cultivated area decreased (average total 29% from 1996 to 2015) because of conversion to built-up area, which more than doubled (average total 30% from 1996 to 2015). The error matrix and confidence limits led to the validation of the result for LULC mapping.
Keywords: land use and land cover change, change detection, image processing, support vector machines
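A minimal sketch of the kind of pipeline the abstract describes, per-pixel SVM classification of two image dates followed by a change-detection cross-tabulation, is shown below. It assumes scikit-learn and uses synthetic band values, class names and array shapes; it is illustrative only and not the authors' implementation.

```python
# Hedged sketch (not the authors' code): per-pixel SVM classification of two image
# dates followed by a change-detection cross-tabulation, using scikit-learn.
# Band values, class labels and array shapes here are illustrative assumptions.
import numpy as np
from sklearn.svm import SVC

classes = ["vegetation", "cultivated", "built_up", "water"]

rng = np.random.default_rng(0)
X_train = rng.random((400, 6))                       # n_pixels x n_bands training spectra (assumed)
y_train = rng.integers(0, len(classes), 400)         # class indices (assumed)

svm = SVC(kernel="rbf", probability=True)            # probabilistic output ~ "soft" allocation
svm.fit(X_train, y_train)

def classify(image):                                 # image: rows x cols x bands
    flat = image.reshape(-1, image.shape[-1])
    return svm.predict(flat).reshape(image.shape[:2])

img_1996 = rng.random((50, 50, 6))
img_2015 = rng.random((50, 50, 6))
map_1996, map_2015 = classify(img_1996), classify(img_2015)

# Change-detection statistics: "from" classes in rows, "to" classes in columns
change_matrix = np.zeros((len(classes), len(classes)), dtype=int)
for a, b in zip(map_1996.ravel(), map_2015.ravel()):
    change_matrix[a, b] += 1
print(change_matrix)
```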
Procedia PDF Downloads 138
1200 Task Validity in Neuroimaging Studies: Perspectives from Applied Linguistics
Authors: L. Freeborn
Abstract:
Recent years have seen an increasing number of neuroimaging studies related to language learning as imaging techniques such as fMRI and EEG have become more widely accessible to researchers. By using a variety of structural and functional neuroimaging techniques, these studies have already made considerable progress in terms of our understanding of neural networks and processing related to first and second language acquisition. However, the methodological designs employed in neuroimaging studies to test language learning have been questioned by applied linguists working within the field of second language acquisition (SLA). One of the major criticisms is that tasks designed to measure language learning gains rarely have a communicative function, and seldom assess learners’ ability to use the language in authentic situations. This brings the validity of many neuroimaging tasks into question. The fundamental reason why people learn a language is to communicate, and it is well-known that both first and second language proficiency are developed through meaningful social interaction. With this in mind, the SLA field is in agreement that second language acquisition and proficiency should be measured through learners’ ability to communicate in authentic real-life situations. Whilst authenticity is not always possible to achieve in a classroom environment, the importance of task authenticity should be reflected in the design of language assessments, teaching materials, and curricula. Tasks that bear little relation to how language is used in real-life situations can be considered to lack construct validity. This paper first describes the typical tasks used in neuroimaging studies to measure language gains and proficiency, then analyses to what extent these tasks can validly assess these constructs.Keywords: neuroimaging studies, research design, second language acquisition, task validity
Procedia PDF Downloads 138
1199 Performance and Processing Evaluation of Solid Oxide Cells by Co-Sintering of GDC Buffer Layer and LSCF Air Electrode
Authors: Hyun-Jong Choi, Minjun Kwak, Doo-Won Seo, Sang-Kuk Woo, Sun-Dong Kim
Abstract:
Solid Oxide Cell (SOC) systems can contribute to the transition to a hydrogen society when utilized as power and hydrogen generators through electrochemical reactions with high efficiency at high operating temperature (>750 ℃). La1-xSrxCo1-yFeyO3 (LSCF), which is an air electrode, suffers stability degradation due to reaction with and delamination from the yttria-stabilized zirconia (YSZ) electrolyte in water electrolysis mode. To mitigate this phenomenon, SOCs need a gadolinium-doped ceria (GDC) buffer layer between the electrolyte and the air electrode. However, the GDC buffer layer requires a high sintering temperature, which causes a reaction with the YSZ electrolyte. This study carried out low-temperature sintering of the GDC layer by applying Cu oxide as a sintering aid. The effect of a copper additive as a sintering aid to lower the sintering temperature for the construction of solid oxide fuel cells (SOFCs) was investigated. GDC buffer layers with 0.25-10 mol% CuO sintering aid were prepared by reacting GDC powder with copper nitrate solution followed by heating at 600 ℃. The sintering of CuO-added GDC powder was optimized by investigating linear shrinkage, microstructure, grain size, ionic conductivity, and activation energy of CuO-GDC electrolytes at temperatures ranging from 1100 to 1400 ℃. The sintering temperature of the CuO-GDC electrolyte decreases from 1400 ℃ to 1100 ℃ by adding the CuO sintering aid. The ionic conductivity of the CuO-GDC electrolyte shows a maximum value at 0.5 mol% of CuO. However, the addition of CuO has no significant effect on the activation energy of the GDC electrolyte. GDC-LSCF layers were co-sintered at 1050 and 1100 ℃, and button cell tests were carried out at 750 ℃.
Keywords: co-sintering, GDC-LSCF, sintering aid, solid oxide cells
Procedia PDF Downloads 245
1198 Tea (Camellia sinensis (L.) O. Kuntze) Typology in Kenya: A Review
Authors: Joseph Kimutai Langat
Abstract:
Tea typology is the science of classifying tea. This study was carried out between November 2023 and July 2024, whose main objective was to investigate the typological classification nomenclature of processed tea in the world, narrowing down to Kenya. Centres of origin, historical background, tea growing region, scientific naming system, market, fermentation levels, processing/ oxidation levels and cultural reasons are used to classify tea at present. Of these, the most common typology is by oxidation, and more specifically, by the production methods within the oxidation categories. While the Asian tea producing countries categorises tea products based on the decreasing oxidation levels during the manufacturing process: black tea, green tea, oolong tea and instant tea, Kenya’s tea typology system is based on the degree of fermentation process, i.e. black tea, purple tea, green tea and white tea. Tea is also classified into five categories: black tea, green tea, white tea, oolong tea, and dark tea. Black tea is the main tea processed and exported in Kenya, manufactured mainly by withering, rolling, or by use of cutting-tearing-curling (CTC) method that ensures efficient conversion of leaf herbage to made tea, oxidizing, and drying before being sorted into different grades. It is from these varied typological methods that this review paper concludes that different regions of the world use different classification nomenclature. Therefore, since tea typology is not standardized, it is recommended that a global tea regulator dealing in tea classification be created to standardize tea typology, with domestic in-country regulatory bodies in tea growing countries accredited to implement the global-wide typological agreements and resolutions.Keywords: classification, fermentation, oxidation, tea, typology
Procedia PDF Downloads 40
1197 Design of SAE J2716 Single Edge Nibble Transmission Digital Sensor Interface for Automotive Applications
Authors: Jongbae Lee, Seongsoo Lee
Abstract:
Modern sensors often embed a small-size digital controller for sensor control, value calibration, and signal processing. These sensors require digital data communication with host microprocessors, but conventional digital communication protocols are too heavy for price reduction. The SAE J2716 SENT (single edge nibble transmission) protocol transmits direct digital waveforms instead of complicated analog modulated signals. In this paper, a SENT interface is designed in Verilog HDL (hardware description language) and implemented on an FPGA (field-programmable gate array) evaluation board. The designed SENT interface consists of a frame encoder/decoder, configuration register, tick period generator, CRC (cyclic redundancy code) generator/checker, and TX/RX (transmission/reception) buffer. The frame encoder/decoder is implemented as a finite state machine, and it controls the whole SENT interface. The configuration register contains various parameters such as operation mode, tick length, CRC option, pause pulse option, and number of data nibbles. The tick period generator generates tick signals from the input clock. The CRC generator/checker generates or checks the CRC in the SENT data frame. The TX/RX buffer stores transmitted/received data. The designed SENT interface can send or receive digital data at 25~65 kbps with a 3 µs tick. Synthesized in a 0.18 µm fabrication technology, it is implemented in about 2,500 gates.
Keywords: digital sensor interface, SAE J2716, SENT, Verilog HDL
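To make the "direct digital waveform" idea concrete, the sketch below shows how SENT nibbles map to pulse widths, assuming the commonly described scheme of a 56-tick synchronisation pulse and data pulses of (12 + nibble value) tick periods. The frame layout (one status nibble, six data nibbles, one CRC nibble) and the values used are illustrative assumptions, not taken from the paper or the standard text, and this Python model is independent of the authors' Verilog design.

```python
# Hedged sketch (not the paper's Verilog): SENT nibble-to-pulse-width encoding,
# assuming a 56-tick sync pulse and (12 + value) tick periods per nibble.
TICK_US = 3.0                      # tick period used in the paper

def nibble_pulse_ticks(value):
    assert 0 <= value <= 0xF
    return 12 + value              # 12..27 tick periods per 4-bit nibble

def frame_pulse_train(status, data_nibbles, crc):
    pulses = [56]                                          # synchronisation/calibration pulse
    pulses += [nibble_pulse_ticks(n) for n in [status, *data_nibbles, crc]]
    return pulses

# Example frame: status, six data nibbles, CRC (all values illustrative)
ticks = frame_pulse_train(status=0x0, data_nibbles=[0x3, 0xA, 0x7, 0x1, 0x0, 0xF], crc=0x5)
print(ticks)
print([t * TICK_US for t in ticks], "microseconds per pulse")
```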
Procedia PDF Downloads 300
1196 Using Wearable Device with Neuron Network to Classify Severity of Sleep Disorder
Authors: Ru-Yin Yang, Chi Wu, Cheng-Yu Tsai, Yin-Tzu Lin, Wen-Te Liu
Abstract:
Background: Sleep breathing disorder (SDB) is a condition characterized by recurrent episodes of airway obstruction leading to intermittent hypoxia and sleep quality fragmentation during sleep time. However, the procedures for SDB severity examination remain complicated and costly. Objective: The objective of this study is to establish a simplified examination method for SDB using a respiratory impedance pattern sensor combined with signal processing and a machine learning model. Methodologies: We recorded heart rate variability by electrocardiogram and the respiratory pattern by impedance. After polysomnography (PSG) was done with the diagnosis of SDB by the apnea and hypopnea index (AHI), we calculated the episodes with absence of flow and the arousal index (AI) from the device record. Subjects were divided into training and testing groups. A neural network was used to establish a prediction model to classify the severity of SDB by the AI, episodes, and body profiles. The performance was evaluated by classification in the testing group compared with PSG. Results: In this study, we enrolled 66 subjects (Male/Female: 37/29; Age: 49.9±13.2) with a diagnosis of SDB in a sleep center in Taipei City, Taiwan, from 2015 to 2016. The accuracy from the confusion matrix on the test group by the neural network is 71.94%. Conclusion: Based on the models, we established a prediction model for SDB by means of the wearable sensor. With more cases incoming and training, this system may be used to rapidly and automatically screen the risk of SDB in the future.
Keywords: sleep breathing disorder, apnea and hypopnea index, body parameters, neuron network
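A minimal sketch of the classification step described above, a small neural network mapping device-derived features to SDB severity classes, is given below. The feature set (arousal index, episode count, age, BMI), the severity bands and the synthetic data are assumptions for illustration; the study's actual network architecture and inputs are not specified here.

```python
# Hedged sketch (not the study's model): neural-network severity classifier
# for SDB from device-derived features, using scikit-learn. All data synthetic.
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import train_test_split
from sklearn.metrics import confusion_matrix, accuracy_score

rng = np.random.default_rng(1)
# columns: arousal index, apnea-like episode count, age, BMI (assumed feature set)
X = rng.random((66, 4)) * [60, 80, 70, 40]
y = rng.integers(0, 4, 66)          # 0=normal, 1=mild, 2=moderate, 3=severe (AHI-style bands)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
model = make_pipeline(StandardScaler(),
                      MLPClassifier(hidden_layer_sizes=(16, 8), max_iter=2000, random_state=0))
model.fit(X_tr, y_tr)

pred = model.predict(X_te)
print("accuracy:", accuracy_score(y_te, pred))
print(confusion_matrix(y_te, pred))   # severity confusion matrix, as reported in the study
```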
Procedia PDF Downloads 150
1195 Surfactant-Assisted Aqueous Extraction of Residual Oil from Palm-Pressed Mesocarp Fibre
Authors: Rabitah Zakaria, Chan M. Luan, Nor Hakimah Ramly
Abstract:
The extraction of vegetable oil using aqueous extraction process assisted by ionic extended surfactant has been investigated as an alternative to hexane extraction. However, the ionic extended surfactant has not been commercialised and its safety with respect to food processing is uncertain. Hence, food-grade non-ionic surfactants (Tween 20, Span 20, and Span 80) were proposed for the extraction of residual oil from palm-pressed mesocarp fibre. Palm-pressed mesocarp fibre contains a significant amount of residual oil ( 5-10 wt %) and its recovery is beneficial as the oil contains much higher content of vitamin E, carotenoids, and sterols compared to crude palm oil. In this study, the formulation of food-grade surfactants using a combination of high hydrophilic-lipophilic balance (HLB) surfactants and low HLB surfactants to produce micro-emulsion with very low interfacial tension (IFT) was investigated. The suitable surfactant formulation was used in the oil extraction process and the efficiency of the extraction was correlated with the IFT, droplet size and viscosity. It was found that a ternary surfactant mixture with a HLB value of 15 (82% Tween 20, 12% Span 20 and 6% Span 80) was able to produce micro-emulsion with very low IFT compared to other HLB combinations. Results suggested that the IFT and droplet size highly affect the oil recovery efficiency. Finally, optimization of the operating parameters shows that the highest extraction efficiency of 78% was achieved at 1:31 solid to liquid ratio, 2 wt % surfactant solution, temperature of 50˚C, and 50 minutes contact time.Keywords: food-grade surfactants, aqueous extraction of residual oil, palm-pressed mesocarp fibre, interfacial tension
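The blend HLB of 15 quoted above can be checked as a mass-weighted average of the individual surfactant HLB values. The calculation below assumes commonly tabulated HLB values (Tween 20 ≈ 16.7, Span 20 ≈ 8.6, Span 80 ≈ 4.3), which are not stated in the abstract itself.

```python
# Hedged check of the blend HLB reported in the abstract, assuming typical
# literature HLB values for the individual surfactants.
hlb = {"Tween 20": 16.7, "Span 20": 8.6, "Span 80": 4.3}
blend = {"Tween 20": 0.82, "Span 20": 0.12, "Span 80": 0.06}   # weight fractions from the abstract

mix_hlb = sum(frac * hlb[name] for name, frac in blend.items())
print(round(mix_hlb, 1))   # ~15.0, consistent with the HLB 15 formulation reported
```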
Procedia PDF Downloads 390
1194 Reorientation of Anisotropic Particles in Free Liquid Microjets
Authors: Mathias Schlenk, Susanne Seibt, Sabine Rosenfeldt, Josef Breu, Stephan Foerster
Abstract:
Thin liquid jets on the micrometer scale play an important role in processing, such as in fiber fabrication and inkjet printing, but also in sample delivery in modern synchrotron X-ray devices. In all these cases the liquid jets contain solvents and dissolved materials such as polymers, nanoparticles, fibers, pigments or proteins. As liquid flow in liquid jets differs significantly from flow in capillaries and microchannels, particle localization and orientation will also be different. This is of critical importance for applications which depend on well-defined homogeneous particle and fiber distribution and orientation in liquid jets. Investigations of particle orientation in liquid microjets of diluted solutions have been rare, despite their importance. With the arrival of micro-focused X-ray beams it has become possible to scan across samples with micrometer resolution to locally analyse the structure and orientation of the samples. In the present work, we used this method to scan across liquid microjets to determine the local distribution and orientation of anisotropic particles. The samples comprise wormlike block copolymer micelles as an example of long flexible fibrous structures, hectorite materials as a model of extended nanosheet structures, and gold nanorods as an illustration of short stiff cylinders, covering all relevant anisotropic geometries. We find that due to the different velocity profile in the liquid jet, which resembles plug flow, the orientation of the particles that was generated in the capillary is lost or changed into non-oriented or biaxial orientations depending on the geometrical shape of the particle.
Keywords: anisotropic particles, liquid microjets, reorientation, SAXS
Procedia PDF Downloads 339
1193 Project Design Deliverables Sequence (PDD)
Authors: Nahed Al-Hajeri
Abstract:
There are several reasons which lead to a delay in project completion, out of all, one main reason is the delay in deliverable processing, i.e. submission and review of documents. Most of the project cycles start with a list of deliverables but without a sequence of submission of the same, means without a direction to move, leading to overlapping of activities and more interdependencies. Hence Project Design Deliverables (PDD) is developed as a solution to Organize Transmittals (Documents/Drawings) received from contractors/consultants during different phases of an EPC (Engineering, Procurement, and Construction) projects, which gives proper direction to the stakeholders from the beginning, to reduce inter-discipline dependency, avoid overlapping of activities, provide a list of deliverables, sequence of activities, etc. PDD attempts to provide a list and sequencing of the engineering documents/drawings required during different phases of a Project which will benefit both client and Contractor in performing planned activities through timely submission and review of deliverables. This helps in ensuring improved quality and completion of Project in time. The successful implementation begins with a detailed understanding the specific challenges and requirements of the project. PDD will help to learn about vendor document submissions including general workflow, sequence and monitor the submission and review of the deliverables from the early stages of Project. This will provide an overview for the Submission of deliverables by the concerned during the projects in proper sequence. The goal of PDD is also to hold responsible and accountability of all stakeholders during complete project cycle. We believe that successful implementation of PDD with a detailed list of documents and their sequence will help organizations to achieve the project target.Keywords: EPC (Engineering, Procurement, and Construction), project design deliverables (PDD), econometrics sciences, management sciences
Procedia PDF Downloads 400
1192 Elimination of Mixed-Culture Biofilms Using Biological Agents
Authors: Anita Vidacs, Csaba Vagvolgyi, Judit Krisch
Abstract:
The attachment of microorganisms to different surfaces and the development of biofilms can lead to outbreaks of food-borne diseases and economic losses due to perished food. In food processing environments, bacterial communities are generally formed by mixed cultures of different species. Plants are sources of several antimicrobial substances that may be potential candidates for the development of new disinfectants. We aimed to investigate the effect of cinnamon (Cinnamomum zeylanicum), marjoram (Origanum majorana), and thyme (Thymus vulgaris) essential oils and their major components (cinnamaldehyde, terpinene-4-ol, and thymol) on four-species biofilms of E. coli, L. monocytogenes, P. putida, and S. aureus. The experiments had three parts: (i) determination of the minimum bactericidal concentration and the killing time with microdilution methods; (ii) elimination of the four-species 24- and 168-hour-old biofilms from stainless steel, polypropylene, tile and wood surfaces; and (iii) comparison of the disinfectant effect with an industrially used peracetic acid-based sanitizer (HC-DPE). E. coli and P. putida were more resistant to the investigated essential oils and their main components in biofilm than L. monocytogenes and S. aureus. These Gram-negative bacteria were detected on the surfaces where the natural-based disinfectant did not have a total biofilm elimination effect. The most promising solutions were the cinnamon essential oil and terpinene-4-ol, which could eradicate the biofilm from stainless steel, polypropylene and even from tile, too. They have a better disinfectant effect than HC-DPE. These natural agents can be used as alternative solutions in the battle against bacterial biofilms.
Keywords: biofilm, essential oils, surfaces, terpinene-4-ol
Procedia PDF Downloads 112
1191 Identification of Potential Small Molecule Regulators of PERK Kinase
Authors: Ireneusz Majsterek, Dariusz Pytel, J. Alan Diehl
Abstract:
PKR-like ER kinase (PERK) is a serine/threonine endoplasmic reticulum (ER) transmembrane kinase activated during ER stress. PERK can activate the signaling pathways known as the unfolded protein response (UPR). Attenuation of translation is mediated by PERK via phosphorylation of eukaryotic initiation factor 2α (eIF2α), which is necessary for translation initiation. PERK activation also directly contributes to activation of Nrf2, which regulates expression of anti-oxidant enzymes. Increased phosphorylation of eIF2α has been reported in Alzheimer's disease (AD) patient hippocampus, indicating that PERK is activated in this disease. Recent data have revealed activation of PERK signaling in non-Hodgkin's lymphomas. Results also revealed that loss of PERK limits mammary tumor cell growth in vitro and in vivo. Consistent with these observations, activation of the UPR in vitro increases levels of the amyloid precursor protein (APP), the precursor from which beta-amyloid (Aβ) plaque fragments are derived. Finally, proteolytic processing of APP, including the cleavages that produce Aβ, largely occurs in the ER, with localization coincident with PERK activity. Thus, we expect that PERK-dependent signaling is critical for the progression of many types of diseases (human cancer, neurodegenerative disease and others). Therefore, modulation of PERK activity may be a useful therapeutic target in the treatment of different diseases that fail to respond to traditional chemotherapeutic strategies, including Alzheimer's disease. Our goal will be to develop therapeutic modalities targeting PERK activity.
Keywords: PERK kinase, small molecule inhibitor, neurodegenerative disease, Alzheimer's disease
Procedia PDF Downloads 482
1190 Structural Damage Detection via Incomplete Model Data Using Output Data Only
Authors: Ahmed Noor Al-qayyim, Barlas Özden Çağlayan
Abstract:
Structural failure is caused mainly by damage that often occurs on structures. Many researchers focus on obtaining very efficient tools to detect the damage in structures in the early state. In the past decades, a subject that has received considerable attention in literature is the damage detection as determined by variations in the dynamic characteristics or response of structures. This study presents a new damage identification technique. The technique detects the damage location for the incomplete structure system using output data only. The method indicates the damage based on the free vibration test data by using “Two Points - Condensation (TPC) technique”. This method creates a set of matrices by reducing the structural system to two degrees of freedom systems. The current stiffness matrices are obtained from optimization of the equation of motion using the measured test data. The current stiffness matrices are compared with original (undamaged) stiffness matrices. High percentage changes in matrices’ coefficients lead to the location of the damage. TPC technique is applied to the experimental data of a simply supported steel beam model structure after inducing thickness change in one element. Where two cases are considered, the method detects the damage and determines its location accurately in both cases. In addition, the results illustrate that these changes in stiffness matrix can be a useful tool for continuous monitoring of structural safety using ambient vibration data. Furthermore, its efficiency proves that this technique can also be used for big structures.Keywords: damage detection, optimization, signals processing, structural health monitoring, two points–condensation
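The core comparison step described above, flagging damage from large percentage changes between baseline and current condensed stiffness coefficients, can be sketched as follows. The 2-DOF matrices, the 20% threshold and the values are illustrative assumptions; the authors' optimization of the equation of motion from measured data is not reproduced here.

```python
# Hedged sketch (not the authors' TPC implementation): percentage change between
# reference and current condensed stiffness matrices, coefficient by coefficient.
import numpy as np

def stiffness_change(K_ref, K_cur, threshold=20.0):
    """Return percentage change per coefficient and the indices exceeding threshold (%)."""
    change = 100.0 * np.abs(K_cur - K_ref) / np.abs(K_ref)
    flagged = np.argwhere(change > threshold)
    return change, flagged

# Illustrative 2-DOF condensed matrices (values assumed, units arbitrary)
K_ref = np.array([[2.0e6, -1.0e6],
                  [-1.0e6, 2.0e6]])
K_cur = np.array([[1.4e6, -0.9e6],
                  [-0.9e6, 1.95e6]])

change, flagged = stiffness_change(K_ref, K_cur)
print(np.round(change, 1))
print("possible damage at coefficients:", flagged.tolist())
```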
Procedia PDF Downloads 365
1189 Recognition and Counting Algorithm for Sub-Regional Objects in a Handwritten Image through Image Sets
Authors: Kothuri Sriraman, Mattupalli Komal Teja
Abstract:
In this paper, a novel algorithm is proposed for the recognition of hulls in handwritten images that may be of irregular, digit, or character shape. Identification of objects and internal objects is quite difficult when the structure of the image contains a bulk of clusters. The estimation results are easily obtained while identifying the sub-regional objects by using the SASK algorithm. The focus is mainly on recognizing the number of internal objects that exist in a given image, so that the result is shadow-free and error-free. The hard clustering and density clustering of the obtained image rough set are used to recognize the differentiated internal objects, if any. Finding the internal hull regions involves three steps: pre-processing, boundary extraction, and finally applying the hull detection system. By detecting the sub-regional hulls, the machine learning capability in the detection of characters can be increased, and the approach can also be extended to obtain hull recognition even in irregular-shaped objects, such as black holes in space exploration with their intensities. Layered hulls are those having structured layers inside; they are useful in military services and traffic monitoring to identify the number of vehicles or persons. The proposed SASK algorithm is helpful in identifying such regions and can be useful in the decision process (to clear the traffic, or to identify the number of persons on the opponent's side in a war).
Keywords: chain code, hull regions, Hough transform, hull recognition, layered outline extraction, SASK algorithm
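The generic three-step chain mentioned above (pre-processing, boundary extraction, hull detection) can be sketched with OpenCV as below. This is not the SASK algorithm itself; the file path, area threshold and use of contour hierarchy to count internal sub-regions are assumptions for illustration.

```python
# Hedged sketch (not SASK): pre-processing -> boundary extraction -> hull detection
# to count closed internal sub-regions in a handwritten character image.
import cv2
import numpy as np

img = cv2.imread("handwritten.png", cv2.IMREAD_GRAYSCALE)   # path is illustrative
_, binary = cv2.threshold(img, 0, 255, cv2.THRESH_BINARY_INV + cv2.THRESH_OTSU)

# Boundary extraction with hierarchy so inner contours (holes) are kept
contours, hierarchy = cv2.findContours(binary, cv2.RETR_CCOMP, cv2.CHAIN_APPROX_SIMPLE)

inner_regions = []
for idx, cnt in enumerate(contours):
    parent = hierarchy[0][idx][3]
    if parent != -1 and cv2.contourArea(cnt) > 10:   # child contour => internal hole/hull
        inner_regions.append(cv2.convexHull(cnt))

print("internal sub-regions found:", len(inner_regions))
```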
Procedia PDF Downloads 348
1188 Event Related Brain Potentials Evoked by Carmen in Musicians and Dancers
Authors: Hanna Poikonen, Petri Toiviainen, Mari Tervaniemi
Abstract:
Event-related potentials (ERPs) evoked by simple tones in the brain have been extensively studied. However, in reality the music surrounding us is spectrally and temporally complex and dynamic. Thus, the research using natural sounds is crucial in understanding the operation of the brain in its natural environment. Music is an excellent example of natural stimulation, which, in various forms, has always been an essential part of different cultures. In addition to sensory responses, music elicits vast cognitive and emotional processes in the brain. When compared to laymen, professional musicians have stronger ERP responses in processing individual musical features in simple tone sequences, such as changes in pitch, timbre and harmony. Here we show that the ERP responses evoked by rapid changes in individual musical features are more intense in musicians than in laymen, also while listening to long excerpts of the composition Carmen. Interestingly, for professional dancers, the amplitudes of the cognitive P300 response are weaker than for musicians but still stronger than for laymen. Also, the cognitive P300 latencies of musicians are significantly shorter whereas the latencies of laymen are significantly longer. In contrast, sensory N100 do not differ in amplitude or latency between musicians and laymen. These results, acquired from a novel ERP methodology for natural music, suggest that we can take the leap of studying the brain with long pieces of natural music also with the ERP method of electroencephalography (EEG), as has already been made with functional magnetic resonance (fMRI), as these two brain imaging devices complement each other.Keywords: electroencephalography, expertise, musical features, real-life music
Procedia PDF Downloads 483
1187 INRAM-3DCNN: Multi-Scale Convolutional Neural Network Based on Residual and Attention Module Combined with Multilayer Perceptron for Hyperspectral Image Classification
Authors: Jianhong Xiang, Rui Sun, Linyu Wang
Abstract:
In recent years, due to the continuous improvement of deep learning theory, Convolutional Neural Network (CNN) has played a great superior performance in the research of Hyperspectral Image (HSI) classification. Since HSI has rich spatial-spectral information, only utilizing a single dimensional or single size convolutional kernel will limit the detailed feature information received by CNN, which limits the classification accuracy of HSI. In this paper, we design a multi-scale CNN with MLP based on residual and attention modules (INRAM-3DCNN) for the HSI classification task. We propose to use multiple 3D convolutional kernels to extract the packet feature information and fully learn the spatial-spectral features of HSI while designing residual 3D convolutional branches to avoid the decline of classification accuracy due to network degradation. Secondly, we also design the 2D Inception module with a joint channel attention mechanism to quickly extract key spatial feature information at different scales of HSI and reduce the complexity of the 3D model. Due to the high parallel processing capability and nonlinear global action of the Multilayer Perceptron (MLP), we use it in combination with the previous CNN structure for the final classification process. The experimental results on two HSI datasets show that the proposed INRAM-3DCNN method has superior classification performance and can perform the classification task excellently.Keywords: INRAM-3DCNN, residual, channel attention, hyperspectral image classification
Procedia PDF Downloads 79
1186 Effect of Saponin Enriched Soapwort Powder on Structural and Sensorial Properties of Turkish Delight
Authors: Ihsan Burak Cam, Ayhan Topuz
Abstract:
Turkish delight has been produced by bleaching the plain delight mix (refined sugar, water and starch) with soapwort extract and powdered sugar. Soapwort extract, which contains a high amount of saponin, is an additive used in Turkish delight and tahini halvah production to improve consistency, chewiness and color, acting as an emulsifier owing to its bioactive saponin content. In this study, soapwort powder was produced after determining the optimum process conditions for soapwort extract by using the response-surface method. The extract was enriched with saponin by reverse osmosis (containing 63% saponin on a dry basis). A Büchi mini spray dryer B-290 was used to produce spray-dried soapwort powder (aw = 0.254) from the enriched soapwort concentrate. The optimized processing steps and the saponin enrichment of the soapwort extract were tested in Turkish delight production. Delight samples produced with the soapwort powder and with commercial extract (control) were compared in chewiness, springiness, stickiness, adhesiveness, hardness, color and sensorial characteristics. According to the results, all textural properties except hardness of the delights produced with the powder were found to be statistically different from the control samples. Chewiness, springiness, stickiness, adhesiveness and hardness values of the samples (delights produced with the powder / control delights) were determined to be 361.9/1406.7, 0.095/0.251, -120.3/-51.7, 781.9/1869.3, and 3427.3 g/3118.4 g, respectively. According to the quality analyses run on the end products, the soapwort extract and the soapwort powder have no statistically negative effect on the color and appearance of the Turkish delight.
Keywords: saponin, delight, soapwort powder, spray drying
Procedia PDF Downloads 253
1185 Genetic Algorithm and Multi Criteria Decision Making Approach for Compressive Sensing Based Direction of Arrival Estimation
Authors: Ekin Nurbaş
Abstract:
One of the essential challenges in array signal processing, which has drawn enormous research interest over the past several decades, is estimating the direction of arrival (DOA) of plane waves impinging on an array of sensors. In recent years, the Compressive Sensing based DoA estimation methods have been proposed by researchers, and it has been discovered that the Compressive Sensing (CS)-based algorithms achieved significant performances for DoA estimation even in scenarios where there are multiple coherent sources. On the other hand, the Genetic Algorithm, which is a method that provides a solution strategy inspired by natural selection, has been used in sparse representation problems in recent years and provides significant improvements in performance. With all of those in consideration, in this paper, a method that combines the Genetic Algorithm (GA) and the Multi-Criteria Decision Making (MCDM) approaches for Direction of Arrival (DoA) estimation in the Compressive Sensing (CS) framework is proposed. In this method, we generate a multi-objective optimization problem by splitting the norm minimization and reconstruction loss minimization parts of the Compressive Sensing algorithm. With the help of the Genetic Algorithm, multiple non-dominated solutions are achieved for the defined multi-objective optimization problem. Among the pareto-frontier solutions, the final solution is obtained with the multiple MCDM methods. Moreover, the performance of the proposed method is compared with the CS-based methods in the literature.Keywords: genetic algorithm, direction of arrival esitmation, multi criteria decision making, compressive sensing
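A rough sketch of the two objectives the abstract splits the CS problem into (reconstruction loss and sparsity norm), followed by a non-dominated filter and a simple weighted-sum choice standing in for the MCDM step, is shown below. The GA operators themselves are omitted, the population is random, and the array size, weights and thresholds are assumptions; this is illustrative only, not the proposed algorithm.

```python
# Hedged sketch (not the proposed method): two-objective view of CS-based DoA and a
# Pareto filter, with a simple weighted-sum pick in place of a full MCDM procedure.
import numpy as np

rng = np.random.default_rng(0)
m, n = 8, 32                       # sensors x angular grid (assumed sizes)
A = rng.standard_normal((m, n)) + 1j * rng.standard_normal((m, n))   # steering dictionary
x_true = np.zeros(n); x_true[[5, 20]] = 1.0
y = A @ x_true + 0.05 * rng.standard_normal(m)

def objectives(x):
    return np.linalg.norm(y - A @ x), np.linalg.norm(x, 1)   # reconstruction loss, l1 norm

population = [np.abs(rng.standard_normal(n)) * (rng.random(n) < 0.1) for _ in range(200)]
scores = np.array([objectives(x) for x in population])

# keep non-dominated (Pareto) candidates
pareto = [i for i, s in enumerate(scores)
          if not any((t <= s).all() and (t < s).any() for t in scores)]

# simple stand-in for MCDM: weighted sum on normalised objectives
norm = scores[pareto] / scores[pareto].max(axis=0)
best = pareto[int(np.argmin(0.7 * norm[:, 0] + 0.3 * norm[:, 1]))]
print("estimated DoA grid indices:", np.nonzero(population[best] > 0.1)[0])
```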
Procedia PDF Downloads 146
1184 Plant Identification Using Convolution Neural Network and Vision Transformer-Based Models
Authors: Virender Singh, Mathew Rees, Simon Hampton, Sivaram Annadurai
Abstract:
Plant identification is a challenging task that aims to identify the family, genus, and species according to plant morphological features. Automated deep learning-based computer vision algorithms are widely used for identifying plants and can help users narrow down the possibilities. However, numerous morphological similarities between and within species render correct classification difficult. In this paper, we tested custom convolution neural network (CNN) and vision transformer (ViT) based models using the PyTorch framework to classify plants. We used a large dataset of 88,000 provided by the Royal Horticultural Society (RHS) and a smaller dataset of 16,000 images from the PlantClef 2015 dataset for classifying plants at genus and species levels, respectively. Our results show that for classifying plants at the genus level, ViT models perform better compared to CNN-based models ResNet50 and ResNet-RS-420 and other state-of-the-art CNN-based models suggested in previous studies on a similar dataset. ViT model achieved top accuracy of 83.3% for classifying plants at the genus level. For classifying plants at the species level, ViT models perform better compared to CNN-based models ResNet50 and ResNet-RS-420, with a top accuracy of 92.5%. We show that the correct set of augmentation techniques plays an important role in classification success. In conclusion, these results could help end users, professionals and the general public alike in identifying plants quicker and with improved accuracy.Keywords: plant identification, CNN, image processing, vision transformer, classification
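A minimal sketch of the kind of ViT fine-tuning described above, using torchvision and PyTorch, is given below. The dataset path, batch size, learning rate and augmentation choices are illustrative assumptions; the authors' actual training configuration is not reproduced here.

```python
# Hedged sketch (not the authors' training code): fine-tuning a pretrained ViT for
# genus-level plant classification with torchvision.
import torch
import torch.nn as nn
from torchvision import datasets, transforms, models

train_tf = transforms.Compose([
    transforms.RandomResizedCrop(224),
    transforms.RandomHorizontalFlip(),      # augmentation choices matter for accuracy
    transforms.ToTensor(),
    transforms.Normalize([0.485, 0.456, 0.406], [0.229, 0.224, 0.225]),
])
train_set = datasets.ImageFolder("rhs_genus/train", transform=train_tf)   # path is illustrative
loader = torch.utils.data.DataLoader(train_set, batch_size=32, shuffle=True)

model = models.vit_b_16(weights=models.ViT_B_16_Weights.IMAGENET1K_V1)
model.heads.head = nn.Linear(model.heads.head.in_features, len(train_set.classes))

optimizer = torch.optim.AdamW(model.parameters(), lr=3e-5)
criterion = nn.CrossEntropyLoss()

model.train()
for images, labels in loader:               # one pass shown; train for more epochs in practice
    optimizer.zero_grad()
    loss = criterion(model(images), labels)
    loss.backward()
    optimizer.step()
```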
Procedia PDF Downloads 103
1183 The Functional Rehabilitation of Peri-Implant Tissue Defects: A Case Report
Authors: Özgür Öztürk, Cumhur Sipahi, Hande Yeşil
Abstract:
Implant-retained restorations commonly consist of a metal framework veneered with ceramic or composite facings. The increasing and expanding use of indirect resin composites in dentistry is a result of innovations in materials and processing techniques. Of special interest to the implant restorative field is the possibility that composites transmit significantly lower peak vertical and transverse forces at the peri-implant level compared to metal-ceramic superstructures in implant-supported restorations. A 43-year-old male patient was referred to the department of prosthodontics for an implant-retained fixed prosthesis. The clinical and radiographic examination of the patient demonstrated the presence of an implant in the right mandibular first molar region. A considerable amount of marginal bone loss around the implant was detected in the radiographic examination, combined with a remarkable peri-implant soft tissue deficiency. To minimize the chewing loads transmitted to the implant-bone interface, it was decided to fabricate an indirect composite resin veneered single metal crown over a screw-retained abutment. At the end of the treatment, the functional and aesthetic deficiencies were fully compensated. After a 6-month clinical and radiographic follow-up period, no additional pathologic involvement was detected at the implant-bone interface, and the implant-retained restoration did not reveal any severe complication.
Keywords: dental implant, fixed partial dentures, indirect composite resin, peri-implant defects
Procedia PDF Downloads 262
1182 Gas Phase Extraction: An Environmentally Sustainable and Effective Method for The Extraction and Recovery of Metal from Ores
Authors: Kolela J Nyembwe, Darlington C. Ashiegbu, Herman J. Potgieter
Abstract:
Over the past few decades, the demand for metals has increased significantly. This has led to a decrease and decline of high-grade ore over time and an increase in mineral complexity and matrix heterogeneity. In addition to that, there are rising concerns about greener processes and a sustainable environment. Due to these challenges, the mining and metal industry has been forced to develop new technologies that are able to economically process and recover metallic values from low-grade ores, materials having a metal content locked up in industrially processed residues (tailings and slag), and complex matrix mineral deposits. Several methods to address these issues have been developed, among which are ionic liquids (IL), heap leaching, and bioleaching. Recently, the gas phase extraction technique has been gaining interest because it eliminates many of the problems encountered in conventional mineral processing methods. The technique relies on the formation of volatile metal complexes, which can be removed from the residual solids by a carrier gas. The complexes can then be reduced using the appropriate method to obtain the metal and regenerate-recover the organic extractant. Laboratory work on the gas phase have been conducted for the extraction and recovery of aluminium (Al), iron (Fe), copper (Cu), chrome (Cr), nickel (Ni), lead (Pb), and vanadium V. In all cases the extraction revealed to depend of temperature and mineral surface area. The process technology appears very promising, offers the feasibility of recirculation, organic reagent regeneration, and has the potential to deliver on all promises of a “greener” process.Keywords: gas-phase extraction, hydrometallurgy, low-grade ore, sustainable environment
Procedia PDF Downloads 132
1181 Comparison of Central Light Reflex Width-to-Retinal Vessel Diameter Ratio between Glaucoma and Normal Eyes by Using Edge Detection Technique
Authors: P. Siriarchawatana, K. Leungchavaphongse, N. Covavisaruch, K. Rojananuangnit, P. Boondaeng, N. Panyayingyong
Abstract:
Glaucoma is a disease that causes visual loss in adults. Glaucoma causes damage to the optic nerve and its overall pathophysiology is still not fully understood. Vasculopathy may be one of the possible causes of nerve damage. Photographic imaging of retinal vessels by fundus camera during eye examination may complement clinical management. This paper presents an innovation for measuring central light reflex width-to-retinal vessel diameter ratio (CRR) from digital retinal photographs. Using our edge detection technique, CRRs from glaucoma and normal eyes were compared to examine differences and associations. CRRs were evaluated on fundus photographs of participants from Mettapracharak (Wat Raikhing) Hospital in Nakhon Pathom, Thailand. Fifty-five photographs from normal eyes and twenty-one photographs from glaucoma eyes were included. Participants with hypertension were excluded. In each photograph, CRRs from four retinal vessels, including arteries and veins in the inferotemporal and superotemporal regions, were quantified using edge detection technique. From our finding, mean CRRs of all four retinal arteries and veins were significantly higher in persons with glaucoma than in those without glaucoma (0.34 vs. 0.32, p < 0.05 for inferotemporal vein, 0.33 vs. 0.30, p < 0.01 for inferotemporal artery, 0.34 vs. 0.31, p < 0.01 for superotemporal vein, and 0.33 vs. 0.30, p < 0.05 for superotemporal artery). From these results, an increase in CRRs of retinal vessels, as quantitatively measured from fundus photographs, could be associated with glaucoma.Keywords: glaucoma, retinal vessel, central light reflex, image processing, fundus photograph, edge detection
Procedia PDF Downloads 325
1180 Effect of Cellular Water Transport on Deformation of Food Material during Drying
Authors: M. Imran Hossen Khan, M. Mahiuddin, M. A. Karim
Abstract:
Drying is a food processing technique where simultaneous heat and mass transfer take place from surface to the center of the sample. Deformation of food materials during drying is a common physical phenomenon which affects the textural quality and taste of the dried product. Most of the plant-based food materials are porous and hygroscopic in nature that contains about 80-90% water in different cellular environments: intercellular environment and intracellular environment. Transport of this cellular water has a significant effect on material deformation during drying. However, understanding of the scale of deformation is very complex due to diverse nature and structural heterogeneity of food material. Knowledge about the effect of transport of cellular water on deformation of material during drying is crucial for increasing the energy efficiency and obtaining better quality dried foods. Therefore, the primary aim of this work is to investigate the effect of intracellular water transport on material deformation during drying. In this study, apple tissue was taken for the investigation. The experiment was carried out using 1H-NMR T2 relaxometry with a conventional dryer. The experimental results are consistent with the understanding that transport of intracellular water causes cellular shrinkage associated with the anisotropic deformation of whole apple tissue. Interestingly, it is found that the deformation of apple tissue takes place at different stages of drying rather than deforming at one time. Moreover, it is found that the penetration rate of heat energy together with the pressure gradient between intracellular and intercellular environments is the responsible force to rupture the cell membrane.Keywords: heat and mass transfer, food material, intracellular water, cell rupture, deformation
Procedia PDF Downloads 221
1179 Linux Security Management: Research and Discussion on Problems Caused by Different Aspects
Authors: Ma Yuzhe, Burra Venkata Durga Kumar
Abstract:
The computer is a great invention. As people use computers more and more frequently, the demand for PCs is growing, and the performance of computer hardware is also rising to handle more complex processing and operations. However, the operating system, which provides the soul of the computer, stagnated for a time. Faced with the high price of UNIX (Uniplexed Information and Computing System), batch after batch of personal computer owners could only give up. DOS (Disk Operating System) was too simple and left little room for innovation, so it was not a good choice. And macOS is a special operating system for Apple computers that cannot be widely used on personal computers. In this environment, Linux, based on the UNIX system, was born. Linux combines the advantages of these operating systems and is built around a modular kernel, which makes its core architecture relatively powerful. The Linux system supports all Internet protocols, so it has very good network functionality. Linux supports multiple users, and each user has no influence on the files of the others. Linux can also multitask, running different programs independently at the same time. Linux is a completely open-source operating system, and users can obtain and modify the source code for free. Because of these advantages, Linux has attracted a large number of users and programmers. The Linux system is also constantly upgraded and improved, and many different versions have been released that are suitable for community use and commercial use. The Linux system has good security because it relies on a file permission system. However, due to constantly emerging vulnerabilities and hazards, the security of using the operating system also needs more attention. This article focuses on the analysis and discussion of Linux security issues.
Keywords: Linux, operating system, system management, security
Procedia PDF Downloads 108
1178 Classifier for Liver Ultrasound Images
Authors: Soumya Sajjan
Abstract:
Liver cancer is the most common cancer worldwide in men and women and is one of the few cancers still on the rise. Liver disease is the 4th leading cause of death. According to new NHS (National Health Service) figures, deaths from liver diseases have reached record levels, rising by 25% in less than a decade; heavy drinking, obesity, and hepatitis are believed to be behind the rise. In this study, we focus on the development of a diagnostic classifier for ultrasound liver lesions. Ultrasound (US) sonography is an easy-to-use and widely popular imaging modality because of its ability to visualize many human soft tissues/organs without any harmful effect. This paper provides an overview of the underlying concepts, along with algorithms for the processing of liver ultrasound images. Naturally, ultrasound liver lesion images contain considerable speckle noise, so developing a classifier for ultrasound liver lesion images is a challenging task. We propose a fully automatic machine learning system for developing this classifier. First, we segment the liver image and calculate textural features from the co-occurrence matrix and the run-length method. For classification, a Support Vector Machine is used based on the risk bounds of statistical learning theory. The textural features from the different feature methods are given as input to the SVM individually. Performance analysis on the training and test datasets is carried out separately using the SVM model. Whenever an ultrasonic liver lesion image is given to the SVM classifier system, the features are calculated and classified as a normal or diseased liver lesion. We hope the result will help the physician to identify liver cancer by a non-invasive method.
Keywords: segmentation, Support Vector Machine, ultrasound liver lesion, co-occurrence matrix
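A minimal sketch of the texture-plus-SVM pipeline outlined above, grey-level co-occurrence matrix (GLCM) features from a segmented region of interest fed to an SVM, is given below using scikit-image and scikit-learn. The ROI size, feature properties, labels and synthetic data are assumptions; the run-length features mentioned in the abstract are not included.

```python
# Hedged sketch (not the study's pipeline): GLCM texture features + SVM classifier.
import numpy as np
from skimage.feature import graycomatrix, graycoprops   # skimage >= 0.19 naming
from sklearn.svm import SVC

def glcm_features(roi):
    """roi: 2-D uint8 array (a segmented liver region)."""
    glcm = graycomatrix(roi, distances=[1], angles=[0, np.pi / 2], levels=256,
                        symmetric=True, normed=True)
    props = ["contrast", "homogeneity", "energy", "correlation"]
    return np.hstack([graycoprops(glcm, p).ravel() for p in props])

# Illustrative data: ROIs and labels (0 = normal, 1 = lesion) assumed to be available
rng = np.random.default_rng(0)
rois = [rng.integers(0, 256, (64, 64), dtype=np.uint8) for _ in range(40)]
labels = rng.integers(0, 2, 40)

X = np.array([glcm_features(r) for r in rois])
clf = SVC(kernel="rbf").fit(X, labels)
print("training accuracy:", clf.score(X, labels))
```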
Procedia PDF Downloads 411
1177 Design and Testing of Electrical Capacitance Tomography Sensors for Oil Pipeline Monitoring
Authors: Sidi M. A. Ghaly, Mohammad O. Khan, Mohammed Shalaby, Khaled A. Al-Snaie
Abstract:
Electrical capacitance tomography (ECT) is a valuable, non-invasive technique used to monitor multiphase flow processes, especially within industrial pipelines. This study focuses on the design, testing, and performance comparison of ECT sensors configured with 8, 12, and 16 electrodes, aiming to evaluate their effectiveness in imaging accuracy, resolution, and sensitivity. Each sensor configuration was designed to capture the spatial permittivity distribution within a pipeline cross-section, enabling visualization of phase distribution and flow characteristics such as oil and water interactions. The sensor designs were implemented and tested in closed pipes to assess their response to varying flow regimes. Capacitance data collected from each electrode configuration were reconstructed into cross-sectional images, enabling a comparison of image resolution, noise levels, and computational demands. Results indicate that the 16-electrode configuration yields higher image resolution and sensitivity to phase boundaries compared to the 8- and 12-electrode setups, making it more suitable for complex flow visualization. However, the 8 and 12-electrode sensors demonstrated advantages in processing speed and lower computational requirements. This comparative analysis provides critical insights into optimizing ECT sensor design based on specific industrial requirements, from high-resolution imaging to real-time monitoring needs.Keywords: capacitance tomography, modeling, simulation, electrode, permittivity, fluid dynamics, imaging sensitivity measurement
Procedia PDF Downloads 10
1176 Histone Deacetylases Inhibitor - Valproic Acid Sensitizes Human Melanoma Cells for alkylating agent and PARP inhibitor
Authors: Małgorzata Drzewiecka, Tomasz Śliwiński, Maciej Radek
Abstract:
The inhibition of histone deacetylases (HDACs) holds promise as a potential anti-cancer therapy because histone and non-histone protein acetylation is frequently disrupted in cancer, leading to cancer initiation and progression. Additionally, histone deacetylase inhibitors (HDACi) such as the class I HDAC inhibitor valproic acid (VPA) have been shown to enhance the effectiveness of DNA-damaging factors, such as cisplatin or radiation. In this study, we found that using VPA in combination with talazoparib (BMN-673, a PARP1 inhibitor, PARPi) and/or dacarbazine (DTIC, an alkylating agent) resulted in increased DNA double-strand breaks (DSBs) and reduced survival (while not affecting primary melanocytes) and proliferation of melanoma cells. Furthermore, pharmacologic inhibition of class I HDACs sensitizes melanoma cells to apoptosis following exposure to DTIC and BMN-673. In addition, inhibition of HDAC caused sensitization of melanoma cells to dacarbazine and BMN-673 in melanoma xenografts in vivo. At the mRNA and protein levels, the histone deacetylase inhibitor downregulated RAD51 and FANCD2. This study shows that combining an HDACi, an alkylating agent and a PARPi could potentially enhance the treatment of melanoma, which is known for being one of the most aggressive malignant tumors. The findings presented here point to a scenario in which HDACs, via enhancing the HR-dependent repair of DSBs created during the processing of DNA lesions, are essential nodes in the resistance of malignant melanoma cells to methylating agent-based therapies.
Keywords: melanoma, HDAC, PARP inhibitor, valproic acid
Procedia PDF Downloads 82
1175 Assessment of Heavy Metals and Radionuclide Concentrations in Mafikeng Waste Water Treatment Plant
Authors: M. Mathuthu, N. N. Gaxela, R. Y. Olobatoke
Abstract:
A study was carried out to assess the heavy metal and radionuclide concentrations of water from the waste water treatment plant in Mafikeng Local Municipality to evaluate treatment efficiency. Ten water samples were collected from various stages of water treatment which included sewage delivered to the plant, the two treatment stages and the effluent and also the community. The samples were analyzed for heavy metal content using Inductive Coupled Plasma Mass Spectrometer. Gross α/β activity concentration in water samples was evaluated by Liquid Scintillation Counting whereas the concentration of individual radionuclides was measured by gamma spectroscopy. The results showed marked reduction in the levels of heavy metal concentration from 3 µg/L (As)–670 µg/L (Na) in sewage into the plant to 2 µg/L (As)–170 µg/L (Fe) in the effluent. Beta activity was not detected in water samples except in the in-coming sewage, the concentration of which was within reference limits. However, the gross α activity in all the water samples (7.7-8.02 Bq/L) exceeded the 0.1 Bq/L limit set by World Health Organization (WHO). Gamma spectroscopy analysis revealed very high concentrations of 235U and 226Ra in water samples, with the lowest concentrations (9.35 and 5.44 Bq/L respectively) in the in-coming sewage and highest concentrations (73.8 and 47 Bq/L respectively) in the community water suggesting contamination along water processing line. All the values were considerably higher than the limits of South Africa Target Water Quality Range and WHO. However, the estimated total doses of the two radionuclides for the analyzed water samples (10.62 - 45.40 µSv yr-1) were all well below the reference level of the committed effective dose of 100 µSv yr-1 recommended by WHO.Keywords: gross α/β activity, heavy metals, radionuclides, 235U, 226Ra, water sample
Procedia PDF Downloads 448
1174 Optimization of Reliability Test Plans: Increase Wafer Fabrication Equipments Uptime
Authors: Swajeeth Panchangam, Arun Rajendran, Swarnim Gupta, Ahmed Zeouita
Abstract:
Semiconductor processing chambers tend to operate in controlled but aggressive conditions (chemistry, plasma, high temperature, etc.). Owing to this, the design of this equipment requires developing robust and reliable hardware and software. Any equipment downtime due to reliability issues can have cost implications both for customers, in terms of tool downtime (reduced throughput), and for equipment manufacturers, in terms of high warranty costs and a customer trust deficit. A thorough reliability assessment of critical parts and a plan for preventive maintenance/replacement schedules need to be done before tool shipment. This helps to save significant warranty costs and tool downtime in the field. However, designing a proper reliability test plan that accurately demonstrates reliability targets with a proper sample size and test duration is quite challenging. This is mainly because components can fail in different failure modes that fit different Weibull beta value distributions. Without an a priori Weibull beta for the failure mode under consideration, the test plan always leads to over- or under-utilization of resources, which eventually ends up in false positive or false negative estimates. This paper proposes a methodology to design a reliability test plan with optimal sample size, test duration, or both (independent of the a priori Weibull beta). This methodology can be used in demonstration tests and can be extended to accelerated life tests to further decrease sample size and test duration.
Keywords: reliability, stochastics, preventive maintenance
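The classical sample-size versus test-duration trade-off that the abstract refers to is captured by the zero-failure (success-run) demonstration relation, n = ln(1 − C) / (L^β · ln R), where C is the confidence level, R the reliability to be demonstrated at the mission time, L the ratio of test length to mission time, and β the Weibull shape parameter. The sketch below evaluates this relation; the target values are illustrative assumptions, and its dependence on an assumed β is precisely the difficulty the paper's method aims to avoid.

```python
# Hedged sketch: classical zero-failure (success-run) demonstration-test relation,
# extended with the Weibull shape parameter beta. Targets below are illustrative.
import math

def samples_needed(confidence, reliability, beta, duration_ratio):
    """Units required for a zero-failure test lasting duration_ratio x the mission time."""
    return math.ceil(math.log(1 - confidence) /
                     (duration_ratio ** beta * math.log(reliability)))

# Demonstrate R = 0.95 at 90% confidence for a part assumed to wear out (beta = 2)
for L in (1.0, 1.5, 2.0):       # test length relative to the mission time
    print(f"test length {L}x mission: {samples_needed(0.90, 0.95, 2.0, L)} units")
```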
Procedia PDF Downloads 15
1173 Energy Efficient Massive Data Dissemination Through Vehicle Mobility in Smart Cities
Authors: Salman Naseer
Abstract:
One of the main challenges of operating a smart city (SC) is collecting the massive data generated from multiple data sources (DS) and transmitting them to the control units (CU) for further data processing and analysis. These ever-increasing data demands require not only more and more capacity on the transmission channels but also result in resource over-provisioning to meet the resilience requirements, and thus unavoidable waste because of the data fluctuations throughout the day. In addition, the high energy consumption (EC) and carbon discharges from these data transmissions pose serious issues to the environment we live in. Therefore, to overcome the issues of intensive EC and carbon emissions (CE) of massive data dissemination in smart cities, we propose an energy-efficient and carbon-reducing approach that utilizes the daily mobility of existing vehicles as an alternative communication channel to accommodate the data dissemination in smart cities. To illustrate the effectiveness and efficiency of our approach, we take Auckland City in New Zealand as an example, assuming massive data generated by various sources geographically scattered throughout the Auckland region and destined for the control centres located in the city centre. The numerical results show that our proposed approach can provide up to 5 times lower delay when transferring large volumes of data by utilizing existing daily vehicle mobility than the conventional transmission network. Moreover, our proposed approach offers about 30% less EC and CE than the conventional network transmission approach.
Keywords: smart city, delay tolerant network, infrastructure offloading, opportunistic network, vehicular mobility, energy consumption, carbon emission
Procedia PDF Downloads 142
1172 Early Depression Detection for Young Adults with a Psychiatric and AI Interdisciplinary Multimodal Framework
Authors: Raymond Xu, Ashley Hua, Andrew Wang, Yuru Lin
Abstract:
During COVID-19, the depression rate has increased dramatically. Young adults are most vulnerable to the mental health effects of the pandemic. Lower-income families have a higher ratio to be diagnosed with depression than the general population, but less access to clinics. This research aims to achieve early depression detection at low cost, large scale, and high accuracy with an interdisciplinary approach by incorporating clinical practices defined by American Psychiatric Association (APA) as well as multimodal AI framework. The proposed approach detected the nine depression symptoms with Natural Language Processing sentiment analysis and a symptom-based Lexicon uniquely designed for young adults. The experiments were conducted on the multimedia survey results from adolescents and young adults and unbiased Twitter communications. The result was further aggregated with the facial emotional cues analyzed by the Convolutional Neural Network on the multimedia survey videos. Five experiments each conducted on 10k data entries reached consistent results with an average accuracy of 88.31%, higher than the existing natural language analysis models. This approach can reach 300+ million daily active Twitter users and is highly accessible by low-income populations to promote early depression detection to raise awareness in adolescents and young adults and reveal complementary cues to assist clinical depression diagnosis.Keywords: artificial intelligence, COVID-19, depression detection, psychiatric disorder
Procedia PDF Downloads 131