Search results for: Lee Filter
234 Design and Simulation of Low Cost Boost-Half-Bridge Microinverter with Grid Connection
Authors: P. Bhavya, P. R. Jayasree
Abstract:
This paper presents a low cost transformer-isolated boost half-bridge micro-inverter for a single-phase grid-connected PV system. Since the output voltage of a single PV panel is as low as 20–50 V, a high voltage gain inverter is required for the PV panel to connect to the single-phase grid. The micro-inverter has two stages: an isolated dc-dc converter stage and an inverter stage with a dc link. To achieve MPPT and to step up the PV voltage to the dc link voltage, a transformer-isolated boost half-bridge dc-dc converter is used. To output synchronised sinusoidal current with unity power factor to the grid, a pulse width modulated full bridge inverter with an LCL filter is used. A variable step size Maximum Power Point Tracking (MPPT) method is adopted so that both fast tracking and high MPPT efficiency are obtained. AC voltage as per grid requirements is obtained at the output of the inverter. High power factor (>0.99) is obtained at both heavy and light loads. This paper gives the results of a computer simulation of the grid-connected solar PV system using MATLAB/Simulink and the SimPowerSystems toolbox.
Keywords: boost-half-bridge, micro-inverter, maximum power point tracking, grid connection, MATLAB/Simulink
Procedia PDF Downloads 336
233 A Convolutional Neural Network-Based Model for Lassa Fever Virus Prediction Using Patient Blood Smear Image
Authors: A. M. John-Otumu, M. M. Rahman, M. C. Onuoha, E. P. Ojonugwa
Abstract:
A Convolutional Neural Network (CNN) model for predicting Lassa fever was built using the Python 3.8.0 programming language, alongside the Keras 2.2.4 and TensorFlow 2.6.1 libraries as the development environment, in order to reduce the currently high risk of Lassa fever in West Africa, particularly in Nigeria. The study was prompted by major flaws in the existing conventional laboratory technique for diagnosing Lassa fever (RT-PCR), as well as flaws in AI-based techniques that have been used for probing and prognosis of Lassa fever reported in the literature. A total of 15,679 blood smear microscopic images were collected. The proposed model was trained on 70% of the dataset and tested on the remaining 30% of the microscopic images to avoid overfitting. A 3x3x3 convolution filter was used in the proposed system to extract features from the microscopic images. The proposed CNN-based model had a recall of 96%, a precision of 93%, an F1 score of 95%, and an accuracy of 94% in predicting and accurately classifying the images into clean or infected samples. Based on empirical evidence from the literature consulted, the proposed model outperformed the other existing AI-based techniques evaluated. If properly deployed, the model will assist physicians, medical laboratory scientists, and patients in making accurate diagnoses of Lassa fever cases, allowing the mortality rate due to the Lassa fever virus to be reduced through sound decision-making.
Keywords: artificial intelligence, ANN, blood smear, CNN, deep learning, Lassa fever
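As a hedged illustration of the feature-extraction step described in this abstract, the sketch below applies a 3x3 filter to a toy image patch the way a CNN convolution layer does. The kernel values and patch are invented for illustration only; the study's actual model was built in Keras/TensorFlow with learned weights.

```python
import numpy as np

def conv2d_valid(image, kernel):
    """Single-channel 'valid' 2-D convolution (cross-correlation, as in CNN
    layers): slide the kernel over the image and sum elementwise products."""
    kh, kw = kernel.shape
    H, W = image.shape
    out = np.empty((H - kh + 1, W - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

# Toy "blood smear" patch with a vertical intensity edge, and a hand-made
# vertical-edge kernel (a real CNN learns its kernels during training).
patch = np.array([[0, 0, 1, 1],
                  [0, 0, 1, 1],
                  [0, 0, 1, 1],
                  [0, 0, 1, 1]], dtype=float)
edge = np.array([[-1, 0, 1],
                 [-1, 0, 1],
                 [-1, 0, 1]], dtype=float)
features = conv2d_valid(patch, edge)  # strong response where the edge sits
```

A trained model stacks many such filters and learns their weights from the 70% training split rather than fixing them by hand.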
Procedia PDF Downloads 118
232 Spatial Object-Oriented Template Matching Algorithm Using Normalized Cross-Correlation Criterion for Tracking Aerial Image Scene
Authors: Jigg Pelayo, Ricardo Villar
Abstract:
Leaning on the development of aerial laser scanning in the Philippine geospatial industry, research on remote sensing and machine vision technology has become a trend. Object detection via template matching is one of its applications, characterized as fast and real-time. This paper provides an application of a robust pattern matching algorithm based on the normalized cross-correlation (NCC) criterion function within object-based image analysis (OBIA), utilizing high-resolution aerial imagery and low-density LiDAR data. The height information from laser scanning provides an effective partitioning order, thus improving the hierarchical class feature pattern and allowing unnecessary calculations to be skipped. Since detection is executed in the object-oriented platform, mathematical morphology and multi-level filter algorithms were established to effectively avoid the influence of noise, small distortions and fluctuating image saturation that affect the rate of recognition of features. Furthermore, the scheme is evaluated to assess its performance in different situations and to inspect the computational complexity of the algorithms. Its effectiveness is demonstrated in areas of Misamis Oriental province, achieving an overall accuracy above 91%. The garnered results also portray the potential and efficiency of the implemented algorithm under different lighting conditions.
Keywords: algorithm, LiDAR, object recognition, OBIA
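The NCC criterion at the heart of the matcher can be sketched as follows. This is a simplified, assumption-laden version (single-band patch and template of equal size, no OBIA partitioning): both arrays are mean-centred and the correlation is normalized by their energies, making the score invariant to brightness and contrast changes.

```python
import numpy as np

def ncc(patch, template):
    """Normalized cross-correlation score between an image patch and a
    same-size template; 1.0 means a perfect match up to affine intensity
    changes, -1.0 a perfect inverted match."""
    a = patch - patch.mean()
    b = template - template.mean()
    denom = np.sqrt((a * a).sum() * (b * b).sum())
    return float((a * b).sum() / denom) if denom else 0.0

template = np.array([[1.0, 2.0],
                     [3.0, 4.0]])
score_same = ncc(template, template)            # perfect match
score_scaled = ncc(2.0 * template + 5.0, template)  # brightness/contrast change
score_inverted = ncc(-template, template)       # inverted pattern
```

In a full matcher, `ncc` is evaluated at every candidate object location and the maxima above a threshold are reported as detections.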
Procedia PDF Downloads 242
231 Feasibility Study of Constructed Wetlands for Wastewater Treatment and Reuse in Asmara, Eritrea
Authors: Hagos Gebrehiwet Bahta
Abstract:
Asmara, the capital city of Eritrea, is facing a sanitation challenge because the city discharges its wastewater to the environment without any kind of treatment. The aim of this research is to conduct a pre-feasibility study of using constructed wetlands in the peri-urban areas of Asmara for wastewater treatment and reuse. It was found that around 15,000 m³ of wastewater is used daily for agricultural activities, and the products are sold in the city's markets, which is claimed to cause some health effects. In this study, three potential sites were investigated around Mai-Bela, and an optimum location was selected on the basis of land availability, topography, and geotechnical information. Some types of local macrophytes that can be used in constructed wetlands have been identified and documented for further studies. It was found that subsurface constructed wetlands can provide sufficient pollutant removal with careful planning and design. Following the feasibility study, a preliminary design of the screening, grit chamber and subsurface constructed wetland was prepared, and a cost estimation was done. In the cost estimation, the filter media was found to be the most expensive part, accounting for around 30% of the overall cost. The city wastewater drainage runs in two directions, and the selected site is located in the southern sub-system, which carries only sewage (separate system). The wastewater analysis conducted around this area (Sembel) indicates high heavy metal levels and organic concentrations, revealing a high level of industrial pollution in addition to the domestic sewage.
Keywords: agriculture, constructed wetland, Mai-Bela, wastewater reuse
Procedia PDF Downloads 211
230 Automatic Reporting System for Transcriptome Indel Identification and Annotation Based on Snapshot of Next-Generation Sequencing Reads Alignment
Authors: Shuo Mu, Guangzhi Jiang, Jinsa Chen
Abstract:
The analysis of Indels in RNA sequencing of clinical samples is easily affected by sequencing errors and software selection. In order to improve the efficiency and accuracy of analysis, we developed an automatic reporting system for Indel recognition and annotation based on image snapshots of transcriptome read alignments. This system includes sequence local-assembly and realignment, target point snapshot, and image-based recognition processes. We integrated high-confidence Indel datasets from several known databases as a training set to improve the accuracy of image processing, and added a bioinformatics processing module to annotate Indels and filter artifacts. The system then automatically generates a report, including data quality levels and image results. Sanger sequencing verification of the reference Indel mutations of cell line NA12878 showed that the process can achieve 83% sensitivity and 96% specificity. Analysis of the collected clinical samples showed that the interpretation accuracy of the process was equivalent to that of manual inspection, while the processing efficiency showed a significant improvement. This work shows the feasibility of accurate Indel analysis of clinical next-generation sequencing (NGS) transcriptomes. The result may be useful for RNA studies of clinical samples with microsatellite instability in immunotherapy in the future.
Keywords: automatic reporting, indel, next-generation sequencing, NGS, transcriptome
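The two validation metrics reported for the pipeline can be sketched from a confusion matrix. The counts below are hypothetical (the abstract does not report them), chosen only so that the metrics reproduce the stated 83% sensitivity and 96% specificity.

```python
def sensitivity_specificity(tp, fn, tn, fp):
    """Sensitivity = recall on true Indels (TP / (TP + FN));
    specificity = true-negative rate (TN / (TN + FP))."""
    return tp / (tp + fn), tn / (tn + fp)

# Hypothetical counts consistent with the reported figures
sens, spec = sensitivity_specificity(tp=83, fn=17, tn=96, fp=4)
```

These are the standard definitions used when image-based calls are verified against a gold standard such as Sanger sequencing.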
Procedia PDF Downloads 189
229 The Research on Diesel Bus Emissions in Ulaanbaatar City: Mongolia
Authors: Tsetsegmaa A., Bayarsuren B., Altantsetseg Ts.
Abstract:
To make the best decisions on reducing harmful emissions from buses, we need a clear understanding of the current state of their actual emissions. The emissions from city buses running on high sulfur fuel, particularly particulate matter (PM) and nitrogen oxides (NOx) in the exhaust gases of conventional diesel engines, have been studied and measured with and without a diesel particulate filter (DPF) in Ulaanbaatar city. The study was conducted using a PEMS (Portable Emissions Measurement System) and the gravimetric method in real traffic conditions. The obtained data were used to determine the actual emission rates and to evaluate the effectiveness of the selected particulate filters. Actual road and daily PM emissions from city buses were determined during the warm and cold seasons. A bus with an average daily mileage of 242 km was found to emit 166.155 g of PM into the city's atmosphere on average per day, with 141.3 g in summer and 175.8 g in winter. The actual PM emission rate of a city bus is 0.6866 g/km. The concentration of NOx in the exhaust gas averages 1410.94 ppm. The use of DPFs reduced the exhaust gas opacity of 24 buses by an average of 97% and filtered a total of 340.4 kg of soot from these buses over a period of six months. Retrofitting an old conventional diesel engine with a cassette-type silicon carbide (SiC) DPF, despite the laboriousness of cleaning, can significantly reduce particulate matter emissions. Innovation: the first comprehensive road PM and NOx emission dataset and actual road emissions from public buses have been identified. PM and NOx mathematical model equations have been estimated as a function of the bus technical speed and engine revolutions, with and without a DPF.
Keywords: conventional diesel, silicon carbide, real-time onboard measurements, particulate matter, diesel retrofit, fuel sulphur
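The reported figures are internally consistent: dividing the average daily PM emission by the average daily mileage reproduces the stated per-kilometre rate. A quick check, using only values quoted in the abstract:

```python
# Values quoted in the abstract
daily_pm_g = 166.155      # average PM emitted per bus per day [g]
daily_km = 242.0          # average daily mileage [km]
summer_g, winter_g = 141.3, 175.8  # seasonal daily PM [g]

# Per-kilometre rate implied by the daily figures
pm_per_km = daily_pm_g / daily_km  # should be close to the reported 0.6866 g/km
```

The winter figure exceeding the summer one is consistent with the cold-season measurements described in the study.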
Procedia PDF Downloads 163
228 The Evaluation of Antioxidant Activity of Aloe Vera (Aloe barbadensis miller)
Authors: R. A. Akande, M. L. Mnisi
Abstract:
Introduction: Aloe vera (Aloe barbadensis miller) flowers are carried in a large candelabra-like flower-head. Aloe barbadensis miller has been known as a traditional herbal medicine for the treatment of many diseases and sicknesses, mainly skin conditions such as sunburn, cold sores and frostbite. It is also used as a fresh food preservative. The main objective of this study is to determine the antioxidant activity of Aloe barbadensis miller. Methodology: The plant material (3 g) was separately extracted with 30 mL of solvents of varying polarity (methanol and ethyl acetate) (technical grade, Merck) in 50 mL polyester centrifuge tubes. The tubes were shaken for 30 minutes on a linear shaker and left overnight. The supernatant was filtered using Whatman No. 1 filter paper before being transferred into pre-weighed glass containers. The solvent was allowed to evaporate under a fan in a room to quantify extraction efficacy. Then, thin layer chromatography (TLC) plates were prepared; a Pasteur pipette was used for spotting each extract (methanol and ethyl acetate) on the TLC plates, and the plates were developed in a saturated TLC tank, dipped in a vanillin-sulphuric acid mixture and heated at 110 °C to detect the separated compounds, and dipped in DPPH in methanol to detect antioxidants. Expected contribution to knowledge: Different compounds, which interact differently with the solvents of different polarity (methanol and ethyl acetate), were observed. The yellow spots observed on the plate dipped in DPPH indicate that Aloe barbadensis miller has antioxidant activity.
Keywords: antioxidant activity, Aloe barbadensis miller, thin layer chromatography, DPPH
Procedia PDF Downloads 446
227 Advantages of Multispectral Imaging for Accurate Gas Temperature Profile Retrieval from Fire Combustion Reactions
Authors: Jean-Philippe Gagnon, Benjamin Saute, Stéphane Boubanga-Tombet
Abstract:
Infrared thermal imaging is used for a wide range of applications, especially in the combustion domain. However, it is well known that most combustion gases, such as carbon dioxide (CO₂), water vapor (H₂O), and carbon monoxide (CO), selectively absorb/emit infrared radiation at discrete energies, i.e., over very narrow spectral ranges. Therefore, temperature profiles of most combustion processes derived from conventional broadband imaging are inaccurate without prior knowledge or assumptions about the spectral emissivity properties of the combustion gases. Using spectral filters allows these critical emissivity parameters to be estimated, in addition to providing selectivity regarding the chemical nature of the combustion gases. However, due to the turbulent nature of most flames, it is crucial that such information be obtained without sacrificing temporal resolution. For this reason, Telops has developed a time-resolved multispectral imaging system which combines a high-performance broadband camera synchronized with a rotating spectral filter wheel. In order to illustrate the benefits of using this system to characterize combustion experiments, measurements were carried out using a Telops MS-IR MW on a very simple combustion system: a wood fire. The temperature profiles calculated using the spectral information from the different channels were compared with corresponding temperature profiles obtained with conventional broadband imaging. The results illustrate the benefits of the Telops MS-IR cameras for the characterization of laminar and turbulent combustion systems at high temporal resolution.
Keywords: infrared, multispectral, fire, broadband, gas temperature, IR camera
Procedia PDF Downloads 142
226 Suspended Sediment Sources Fingerprinting in Ashebeka River Catchment, Assela, Central Ethiopia
Authors: Getachew Meka, Bezatu Mengiste, Tena Alamirew
Abstract:
Ashebeka River is the main source of drinking water supply for Assela City and its surrounding inhabitants. Apart from seasonal disruption of water reliability, the cost of treating water downstream of the river has been increasing over time due to increased pollutants and suspended sediments. Therefore, this research aimed to geo-locate and prioritize suspended sediment sources in the Ashebeka River catchment using sediment fingerprinting. We collected 58 composite soil samples and river water samples for suspended sediment from the outlet, which were then filtered using Whatman filter paper. The samples were quantified for geochemical tracers using inductively coupled plasma-optical emission spectrometry (ICP-OES), which has multi-element capability. Tracers with significant p-values that passed the Kruskal-Wallis (KW) test were analyzed using stepwise discriminant function analysis (DFA). Tracers showing good discrimination in the DFA results were subsequently used for the mixing model analysis. The relative sediment source contributions from sub-catchments 3, 4, 1, and 2 (with areas of 8, 5, 5.6, and 28.4 km², respectively) were estimated as 49.31%, 26.71%, 23.65%, and 0.33%, respectively. The findings of this study will help the water utilities to prioritize areas of intervention, and the approach used could be followed for catchment prioritization in water safety plan development. Moreover, the findings shed light on the integration of sediment fingerprinting into water safety plans to ensure the reliability of drinking water supplies.
Keywords: disruption of drinking water reliability, Ashebeka River catchment, sediment fingerprinting, sediment source contribution, mixing model
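The reported sub-catchment contributions sum to 100%, as a mixing model requires. The core idea can be sketched with a toy two-source, two-tracer example: the tracer signature of the outlet sediment is modelled as a proportion-weighted average of the source signatures, with proportions constrained to sum to one. All concentrations below are invented for illustration and are not from the study.

```python
import numpy as np

# Columns = sources, rows = tracers (illustrative concentrations)
sources = np.array([[10.0, 4.0],   # tracer A in source 1 and source 2
                    [2.0,  8.0]])  # tracer B in source 1 and source 2
mixture = np.array([7.0, 5.0])     # tracer signature measured at the outlet

# Solve sources @ p = mixture subject to sum(p) = 1, as a stacked
# least-squares system (a real un-mixing model adds p >= 0 and tracer
# uncertainty weighting).
A = np.vstack([sources, np.ones(2)])
b = np.append(mixture, 1.0)
p, *_ = np.linalg.lstsq(A, b, rcond=None)  # source proportions
```

With these numbers the system is exactly consistent and each source contributes half of the outlet sediment.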
Procedia PDF Downloads 24
225 Multiparametric Optimization of Water Treatment Process for Thermal Power Plants
Authors: Balgaisha Mukanova, Natalya Glazyrina, Sergey Glazyrin
Abstract:
The problem of optimization of the technological process of water treatment for thermal power plants is considered in this article. The problem is of a multiparametric nature. To optimize the process, namely, to reduce the amount of wastewater, a new technology was developed to reuse such water. A mathematical model of the wastewater reuse technology was developed, and the optimization parameters were determined. The model consists of a material balance equation, an equation describing the kinetics of ion exchange for the non-equilibrium case, and an equation for the ion exchange isotherm. The material balance equation includes a nonlinear term that depends on the kinetics of ion exchange. The direct problem of calculating the impurity concentration at the outlet of the water treatment plant was solved numerically, approximated by an implicit point-to-point computation difference scheme. The inverse problem was formulated as the determination of the parameters of the mathematical model of the water treatment plant operating in non-equilibrium conditions, and was solved. Based on the calculation results, the start time of the filter regeneration process was determined, as well as the duration of the regeneration process and the amount of regeneration and wash water. Multiparametric optimization of the water treatment process for thermal power plants allowed the amount of wastewater to be decreased by 15%.
Keywords: direct problem, multiparametric optimization, optimization parameters, water treatment
Procedia PDF Downloads 385
224 A Robust Spatial Feature Extraction Method for Facial Expression Recognition
Authors: H. G. C. P. Dinesh, G. Tharshini, M. P. B. Ekanayake, G. M. R. I. Godaliyadda
Abstract:
This paper presents a new spatial feature extraction method based on principal component analysis (PCA) and Fisher discriminant analysis (FDA) for facial expression recognition. It not only extracts reliable features for classification, but also reduces the feature space dimensions of pattern samples. In this method, each grayscale image is first considered in its entirety as the measurement matrix. Then, the principal components (PCs) of the row vectors of this matrix and the variance of these row vectors along the PCs are estimated. Therefore, this method ensures the preservation of the spatial information of the facial image. Afterwards, by incorporating the spectral information of the eigen-filters derived from the PCs, a feature vector is constructed for a given image. Finally, FDA is used to define a set of basis vectors in a reduced-dimension subspace such that optimal clustering is achieved. FDA defines an inter-class scatter matrix and an intra-class scatter matrix to enhance the compactness of each cluster while maximizing the distance between cluster marginal points. To match the test image with the training set, a cosine similarity based Bayesian classification was used. The proposed method was tested on the Cohn-Kanade database and the JAFFE database. It was observed that the proposed method, which incorporates spatial information to construct an optimal feature space, outperforms the standard PCA and FDA based methods.
Keywords: facial expression recognition, principal component analysis (PCA), Fisher discriminant analysis (FDA), eigen-filter, cosine similarity, Bayesian classifier, f-measure
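The final matching step can be sketched as follows: after projection into the FDA subspace, a test feature vector is assigned to the expression class whose mean it is most similar to under cosine similarity. This is a simplified stand-in for the paper's cosine-similarity-based Bayesian classification; the 2-D vectors and class names are invented for illustration.

```python
import numpy as np

def cosine_similarity(u, v):
    """Cosine of the angle between two feature vectors (1.0 = same direction)."""
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

def classify(feature, class_means):
    """Nearest-class-mean assignment under cosine similarity."""
    scores = {c: cosine_similarity(feature, m) for c, m in class_means.items()}
    return max(scores, key=scores.get)

# Hypothetical FDA-projected class means for two expressions
means = {"happy": np.array([1.0, 0.1]),
         "sad":   np.array([0.1, 1.0])}
label = classify(np.array([0.9, 0.2]), means)
```

Cosine similarity is a natural choice here because FDA projections make class directions, rather than magnitudes, the discriminative quantity.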
Procedia PDF Downloads 423
223 Solar-Powered Water Purification Using Ozone and Sand Filtration
Authors: Kayla Youhanaie, Kenneth Dott, Greg Gillis-Smith
Abstract:
Access to clean water is a global challenge that affects nearly one-third of the world’s population. A lack of safe drinking water negatively affects a person’s health, safety, and economic status. However, many regions of the world that face this clean water challenge also have high solar energy potential. To address this worldwide issue and utilize available resources, a solar-powered water purification device was developed that could be implemented in communities around the world that lack access to potable water. The device uses ozone to destroy water-borne pathogens and sand filtration to filter particulates out of the water. To select the best method for this application, a quantitative energy efficiency comparison of three water purification methods was conducted: heat, UV light, and ozone. After constructing an initial prototype, the efficacy of the device was tested using agar petri dishes to test for bacterial growth in treated water samples at various time intervals after applying the device to contaminated water. The results demonstrated that the water purification device successfully removed all bacteria and particulates from the water within three minutes, making it safe for human consumption. These results, as well as the proposed design that utilizes widely available resources in target communities, suggest that the device is a sustainable solution to address the global water crisis and could improve the quality of life for millions of people worldwide.
Keywords: clean water, solar powered water purification, ozonation, sand filtration, global water crisis
Procedia PDF Downloads 74
222 The Use of Information and Communication Technologies in Electoral Procedures: Comments on Electronic Voting Security
Authors: Magdalena Musiał-Karg
Abstract:
The expansion of telecommunication and the progress of electronic media constitute important elements of our times. The recent worldwide convergence of information and communication technologies (ICT) and the dynamic development of the mass media are leading to noticeable changes in the functioning of contemporary states and societies. Currently, modern technologies play more and more important roles and filter down to almost every field of contemporary human life, resulting in a growth of online interactions that can be observed in the inconceivable increase in the number of people with home PCs and Internet access. Proof of this is undoubtedly the emergence and use of concepts such as e-society, e-banking, e-services, e-government, e-participation and e-democracy. The newly coined word e-democracy evidences that modern technologies have also been widely used in politics. Without any doubt, in most countries all actors in the political market (politicians, political parties, servants in the political/public sector, the media) use modern forms of communication with society. Most of these modern technologies improve the processes of getting information to and from citizens, communication with the electorate, and also, which seems to be the biggest advantage, electoral procedures. Thanks to the implementation of ICT, the interaction between politicians and the electorate is improved. The main goal of this text is to analyze electronic voting (e-voting) as one of the important forms of electronic democracy in terms of its security aspects. The author aimed at answering questions about the security of electronic voting as an additional form of participation in elections and referenda.
Keywords: electronic democracy, electronic voting, security of e-voting, information and communication technology (ICT)
Procedia PDF Downloads 238
221 Unsupervised Segmentation Technique for Acute Leukemia Cells Using Clustering Algorithms
Authors: N. H. Harun, A. S. Abdul Nasir, M. Y. Mashor, R. Hassan
Abstract:
Leukaemia is a blood cancer that contributes to the increase in the mortality rate in Malaysia each year. There are two main categories of leukaemia: acute and chronic leukaemia. The production and development of acute leukaemia cells occur rapidly and uncontrollably. Therefore, if the identification of acute leukaemia cells could be done quickly and effectively, proper treatment and medicine could be delivered. Due to the requirement of prompt and accurate diagnosis of leukaemia, the current study proposes unsupervised pixel segmentation based on clustering algorithms in order to obtain a fully segmented abnormal white blood cell (blast) in acute leukaemia images. To obtain the segmented blast, three clustering algorithms, namely k-means, fuzzy c-means and moving k-means, were applied to the saturation component image. Then, median filter and seeded region growing area extraction algorithms were applied to smooth the region of the segmented blast and to remove large unwanted regions from the image, respectively. Comparisons among the three clustering algorithms were made in order to measure the performance of each on segmenting the blast area. Based on the good sensitivity values obtained, the results indicate that the moving k-means clustering algorithm successfully produced a fully segmented blast region in the acute leukaemia images. Hence, the resultant images could be helpful to haematologists for further analysis of acute leukaemia.
Keywords: acute leukaemia images, clustering algorithms, image segmentation, moving k-means
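The pixel clustering step can be sketched as a minimal k-means on 1-D saturation values. This is a simplified illustration only (the paper also evaluates fuzzy c-means and moving k-means); centers start at evenly spaced quantiles so the toy example is deterministic, and the bimodal "saturation" data are synthetic.

```python
import numpy as np

def kmeans_1d(values, k=2, iters=10):
    """Minimal k-means on a 1-D array: alternate assigning each value to its
    nearest center and recomputing each center as its cluster mean."""
    centers = np.quantile(values, np.linspace(0.0, 1.0, k))
    for _ in range(iters):
        labels = np.argmin(np.abs(values[:, None] - centers[None, :]), axis=1)
        centers = np.array([values[labels == j].mean() if np.any(labels == j)
                            else centers[j] for j in range(k)])
    return labels, centers

# Synthetic bimodal saturation: background pixels near 0.1, blast pixels near 0.8
sat = np.concatenate([np.full(50, 0.1), np.full(50, 0.8)])
labels, centers = kmeans_1d(sat, k=2)
```

On a real image, the high-saturation cluster would form the candidate blast mask, which the median filter and region-growing steps then clean up.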
Procedia PDF Downloads 290
220 A Neurofeedback Learning Model Using Time-Frequency Analysis for Volleyball Performance Enhancement
Authors: Hamed Yousefi, Farnaz Mohammadi, Niloufar Mirian, Navid Amini
Abstract:
Investigating possible capacities of visual functions where adapted mechanisms can enhance the capability of sports trainees is a promising area of research, not only from the cognitive viewpoint but also in terms of unlimited applications in sports training. In this paper, the visual evoked potential (VEP) and event-related potential (ERP) signals of amateur and trained volleyball players in a pilot study were processed. Two groups of amateur and trained subjects were asked to imagine themselves in the state of receiving a ball while shown a simulated volleyball field. The proposed method is based on a set of time-frequency features, extracted from the VEP signals using algorithms such as the Gabor filter, the continuous wavelet transform, and a multi-stage wavelet decomposition, that can be indicative of being amateur or trained. The linear discriminant classifier achieves accuracy, sensitivity, and specificity of 100% when the average of the repetitions of the signal corresponding to the task is used. The main purpose of this study is to investigate the feasibility of a fast, robust, and reliable feature/model determination as a neurofeedback parameter to be utilized for improving volleyball players’ performance. The proposed measure has potential applications in brain-computer interface technology, where a real-time biomarker is needed.
Keywords: visual evoked potential, time-frequency feature extraction, short-time Fourier transform, event-related spectrum potential classification, linear discriminant analysis
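One of the named time-frequency extractors, the Gabor filter, can be sketched in 1-D as a Gaussian-windowed cosine: convolving a signal with a filter tuned to a given frequency yields a large response only when that frequency is present. The sampling rate, frequencies, and bandwidth below are illustrative assumptions, not the study's parameters.

```python
import numpy as np

def gabor_1d(fs, f0, sigma, duration=1.0):
    """Real 1-D Gabor filter: Gaussian envelope (width sigma, seconds)
    modulating a cosine at center frequency f0 (Hz)."""
    t = np.arange(-duration / 2, duration / 2, 1.0 / fs)
    return np.exp(-t**2 / (2 * sigma**2)) * np.cos(2 * np.pi * f0 * t)

fs = 256.0                                # assumed sampling rate [Hz]
t = np.arange(0, 2.0, 1 / fs)
signal_10hz = np.sin(2 * np.pi * 10 * t)  # toy oscillation at 10 Hz

g10 = gabor_1d(fs, 10.0, 0.1)             # filter tuned to the signal
g30 = gabor_1d(fs, 30.0, 0.1)             # filter tuned off-frequency
e10 = float(np.max(np.abs(np.convolve(signal_10hz, g10, mode="same"))))
e30 = float(np.max(np.abs(np.convolve(signal_10hz, g30, mode="same"))))
```

A bank of such filters at several center frequencies produces the kind of time-frequency feature vector that the linear discriminant classifier then separates.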
Procedia PDF Downloads 135
219 FACTS Based Stabilization for Smart Grid Applications
Authors: Adel. M. Sharaf, Foad H. Gandoman
Abstract:
Nowadays, photovoltaic (PV) farms/parks and large PV-smart grid interface schemes are emerging and commonly utilized in renewable energy distributed generation. However, PV hybrid dc-ac schemes using interface power electronic converters usually have a negative impact on power quality and the stabilization of the modern electrical network under load excursions and network fault conditions in the smart grid. Consequently, robust FACTS based interface schemes are required to ensure efficient energy utilization and stabilization of bus voltages, as well as limiting switching/fault inrush current conditions. FACTS devices are also used in smart grid battery interface and storage schemes with PV-battery storage hybrid systems as an elegant alternative for renewable energy utilization with backup battery storage, serving electric utility energy and demand side management to provide the needed energy and power capacity under heavy load conditions. The paper presents a robust PV-Li-Ion battery storage interface scheme for distribution/utilization low voltage interfaces, using FACTS stabilization enhancement and dynamic maximum PV power tracking controllers. Digital simulation and validation of the proposed scheme are done using the MATLAB/Simulink software environment for a low voltage distribution/utilization system feeding hybrid linear, motorized-inrush and nonlinear loads from a dc-ac interface VSC 6-pulse inverter fed from the PV park/farm with a backup Li-Ion storage battery.
Keywords: AC FACTS, smart grid, stabilization, PV-battery storage, switched filter-compensation (SFC)
Procedia PDF Downloads 409
218 Extracting Terrain Points from Airborne Laser Scanning Data in Densely Forested Areas
Authors: Ziad Abdeldayem, Jakub Markiewicz, Kunal Kansara, Laura Edwards
Abstract:
Airborne Laser Scanning (ALS) is one of the main technologies for generating high-resolution digital terrain models (DTMs). DTMs are crucial to several applications, such as topographic mapping, flood zone delineation, geographic information systems (GIS), hydrological modelling, spatial analysis, etc. A laser scanning system generates an irregularly spaced three-dimensional cloud of points. Raw ALS data consist mainly of ground points (that represent the bare earth) and non-ground points (that represent buildings, trees, cars, etc.). Removing all the non-ground points from the raw data is referred to as filtering. Filtering heavily forested areas is considered a difficult and challenging task, as the canopy stops laser pulses from reaching the terrain surface. This research presents an approach for removing non-ground points from raw ALS data in densely forested areas. Smoothing splines are exploited to interpolate and fit the noisy ALS data. The presented filter utilizes a weight function to allocate a weight to each point of the data. Furthermore, unlike most of the methods, the presented filtering algorithm is designed to be automatic. Three different forested areas in the United Kingdom are used to assess the performance of the algorithm. The results show that the DTMs generated from the filtered data are accurate (when compared against reference terrain data) and that the performance of the method is stable for all the heavily forested data samples. The average root mean square error (RMSE) value is 0.35 m.
Keywords: airborne laser scanning, digital terrain models, filtering, forested areas
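The weight-function idea can be sketched in 1-D: fit a smooth surface, then down-weight returns lying well above it (likely canopy) and refit. The sketch below substitutes a low-order polynomial for the paper's smoothing spline, and the height threshold, weights, and synthetic terrain are illustrative assumptions.

```python
import numpy as np

def ground_filter_1d(x, z, degree=2, iters=3, height_thresh=1.0):
    """Iteratively fit a smooth surface and suppress points far above it.
    Points more than height_thresh above the fit get near-zero weight,
    so the next fit follows the bare earth."""
    w = np.ones_like(z)
    for _ in range(iters):
        coef = np.polyfit(x, z, degree, w=w)
        resid = z - np.polyval(coef, x)
        w = np.where(resid > height_thresh, 0.01, 1.0)  # likely canopy returns
    return np.polyval(coef, x), resid <= height_thresh

x = np.linspace(0.0, 10.0, 200)
true_ground = 0.05 * x**2            # synthetic terrain profile
z = true_ground.copy()
z[::10] += 8.0                       # every 10th return hits a tree crown
dtm, is_ground = ground_filter_1d(x, z)
rmse = float(np.sqrt(np.mean((dtm[is_ground] - true_ground[is_ground])**2)))
```

In 2-D ALS data, the same loop runs with a bivariate smoothing spline in place of the polynomial, and the retained points define the DTM.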
Procedia PDF Downloads 138
217 The BNCT Project Using the Cf-252 Source: Monte Carlo Simulations
Authors: Marta Błażkiewicz-Mazurek, Adam Konefał
Abstract:
The project can be divided into three main parts: i. modeling the Cf-252 neutron source and conducting an experiment to verify the correctness of the obtained results, ii. design of the BNCT system infrastructure, iii. analysis of the results from the logical detector. Modeling of the Cf-252 source included designing the shape and size of the source as well as the energy and spatial distribution of the emitted neutrons. Two options were considered: a point source and a cylindrical spatial source. The energy distribution corresponded to various spectra taken from the specialized literature. Directionally isotropic neutron emission was simulated. The simulation results were compared with experimental values determined using the activation detector method with indium foils and cadmium shields. The relative fluence rates of thermal and resonance neutrons were compared at chosen places in the vicinity of the source. The second part of the project, related to the modeling of the BNCT infrastructure, consisted of developing a simulation program taking into account all the essential components of this system. Materials with neutron moderating, absorbing, and backscattering properties were adopted into the project. Additionally, a gamma radiation filter was introduced into the beam output system. The analysis of the simulation results, obtained using a logical detector located at the beam exit from the BNCT infrastructure, included the neutron energy and spatial distribution. Optimization of the system involved changing its size and materials to obtain a suitably collimated beam of thermal neutrons.
Keywords: BNCT, Monte Carlo, neutrons, simulation, modeling
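The directionally isotropic emission mentioned above corresponds to sampling unit vectors uniformly on the sphere, which a Monte Carlo transport code does by drawing cos(theta) uniformly on [-1, 1] and the azimuth uniformly on [0, 2*pi). A minimal sketch of that sampling step (sample count and seed are arbitrary):

```python
import math
import random

def isotropic_direction(rng):
    """Sample a unit vector uniformly over the sphere: cos(theta) uniform
    on [-1, 1], azimuth phi uniform on [0, 2*pi)."""
    cos_theta = rng.uniform(-1.0, 1.0)
    phi = rng.uniform(0.0, 2.0 * math.pi)
    sin_theta = math.sqrt(1.0 - cos_theta * cos_theta)
    return (sin_theta * math.cos(phi), sin_theta * math.sin(phi), cos_theta)

rng = random.Random(42)
dirs = [isotropic_direction(rng) for _ in range(20000)]
mean_z = sum(d[2] for d in dirs) / len(dirs)  # should be near 0 for isotropy
```

In the full simulation, each sampled direction is paired with an energy drawn from the adopted Cf-252 spectrum before the neutron is transported through the moderator and filter geometry.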
Procedia PDF Downloads 28
216 Establishment of a Nomogram Prediction Model for Postpartum Hemorrhage during Vaginal Delivery
Authors: Yinglisong, Jingge Chen, Jingxuan Chen, Yan Wang, Hui Huang, Jing Zhang, Qianqian Zhang, Zhenzhen Zhang, Ji Zhang
Abstract:
Purpose: The study aims to establish a nomogram prediction model for postpartum hemorrhage (PPH) in vaginal delivery. Patients and Methods: Clinical data were retrospectively collected from vaginal delivery patients admitted to a hospital in Zhengzhou, China, from June 1, 2022, to October 31, 2022. Univariate and multivariate logistic regression were used to screen for independent risk factors. A nomogram model for PPH in vaginal delivery was established based on the risk factor coefficients. Bootstrapping was used for internal validation. To assess discrimination and calibration, receiver operating characteristic (ROC) and calibration curves were generated in the derivation and validation groups. Results: A total of 1340 cases of vaginal delivery were enrolled, of which 81 (6.04%) had PPH. Logistic regression indicated that a history of uterine surgery, induction of labor, duration of the first stage of labor, neonatal weight, WBC value (during the first stage of labor), and cervical lacerations were all independent risk factors for hemorrhage (P < 0.05). The areas under the curve (AUC) of the ROC curves of the derivation and validation groups were 0.817 and 0.821, respectively, indicating good discrimination. Two calibration curves showed that the nomogram predictions and practical results were highly consistent (P = 0.105, P = 0.113). Conclusion: The developed individualized risk prediction nomogram model can assist midwives in recognizing and diagnosing high-risk groups for PPH and in initiating early warning to reduce PPH incidence.
Keywords: vaginal delivery, postpartum hemorrhage, risk factor, nomogram
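The modelling pipeline (a logistic regression whose coefficients feed the nomogram, scored by a rank-based AUC) can be sketched as follows; the gradient-descent fit and the synthetic stand-in predictors are illustrative assumptions, not the study's clinical variables:

```python
import numpy as np

def fit_logistic(X, y, lr=0.1, steps=2000):
    """Plain gradient-descent logistic regression (intercept included)."""
    Xb = np.hstack([np.ones((len(X), 1)), X])
    beta = np.zeros(Xb.shape[1])
    for _ in range(steps):
        p = 1.0 / (1.0 + np.exp(-Xb @ beta))
        beta -= lr * Xb.T @ (p - y) / len(y)
    return beta

def predict_risk(beta, X):
    Xb = np.hstack([np.ones((len(X), 1)), X])
    return 1.0 / (1.0 + np.exp(-Xb @ beta))

def auc(y, scores):
    """Rank-based AUC: probability a positive case outranks a negative one."""
    pos, neg = scores[y == 1], scores[y == 0]
    return (pos[:, None] > neg[None, :]).mean()

# synthetic cohort: 3 stand-in predictors (e.g. neonatal weight, WBC, duration)
rng = np.random.default_rng(1)
n = 500
X = rng.normal(size=(n, 3))
logit = 1.5 * X[:, 0] - 1.0 * X[:, 1] - 2.0          # "true" risk model
y = (rng.uniform(size=n) < 1 / (1 + np.exp(-logit))).astype(float)

beta = fit_logistic(X, y)
risk = predict_risk(beta, X)      # per-patient predicted PPH probability
```

A nomogram is then just a graphical rescaling of `beta`: each predictor's contribution `beta[j] * x[j]` is mapped to a points axis and the total points to a probability.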
Procedia PDF Downloads 73
215 Automatic Staging and Subtype Determination for Non-Small Cell Lung Carcinoma Using PET Image Texture Analysis
Authors: Seyhan Karaçavuş, Bülent Yılmaz, Ömer Kayaaltı, Semra İçer, Arzu Taşdemir, Oğuzhan Ayyıldız, Kübra Eset, Eser Kaya
Abstract:
In this study, our goal was to perform tumor staging and subtype determination automatically using different texture analysis approaches for a very common cancer type, i.e., non-small cell lung carcinoma (NSCLC). In particular, we introduced a texture analysis approach, the Laws' texture filter, to be used in this context for the first time. The 18F-FDG PET images of 42 patients with NSCLC were evaluated. The number of patients for each tumor stage, i.e., I-II, III, or IV, was 14. The patients had ~45% adenocarcinoma (ADC) and ~55% squamous cell carcinoma (SqCC). The MATLAB technical computing language was employed in the extraction of 51 features using first-order statistics (FOS), the gray-level co-occurrence matrix (GLCM), the gray-level run-length matrix (GLRLM), and Laws' texture filters. The feature selection method employed was sequential forward selection (SFS). The selected textural features were used in automatic classification by k-nearest neighbors (k-NN) and support vector machines (SVM). In the automatic classification of tumor stage, the accuracy was approximately 59.5% with the k-NN classifier (k=3) and 69% with SVM (with the one-versus-one paradigm), using 5 features. In the automatic classification of tumor subtype, the accuracy was around 92.7% with one-versus-one SVM. Texture analysis of FDG-PET images might be used, in addition to metabolic parameters, as an objective tool to assess tumor histopathological characteristics and in the automatic classification of tumor stage and subtype.
Keywords: cancer stage, cancer cell type, non-small cell lung carcinoma, PET, texture analysis
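Laws' texture filtering is built from small separable 1-D masks whose outer products form 2-D kernels; the image response to each kernel is summarised as a texture "energy". A minimal numpy sketch (energy features only, on synthetic images, not the authors' 51-feature MATLAB pipeline) might look like:

```python
import numpy as np

# the classic 5-tap Laws masks (level, edge, spot, ripple)
L5 = np.array([1, 4, 6, 4, 1], float)
E5 = np.array([-1, -2, 0, 2, 1], float)
S5 = np.array([-1, 0, 2, 0, -1], float)
R5 = np.array([1, -4, 6, -4, 1], float)

def laws_energy(img, v, h):
    """Convolve with the separable mask v (vertical) x h (horizontal)
    and return the mean absolute response as a texture energy."""
    rows = np.apply_along_axis(lambda r: np.convolve(r, h, mode="same"), 1, img)
    resp = np.apply_along_axis(lambda c: np.convolve(c, v, mode="same"), 0, rows)
    return np.abs(resp).mean()

def laws_features(img):
    masks = {"L5": L5, "E5": E5, "S5": S5, "R5": R5}
    return {f"{a}{b}": laws_energy(img, masks[a], masks[b])
            for a in masks for b in masks}

rng = np.random.default_rng(2)
smooth = np.tile(np.linspace(0, 1, 64), (64, 1))   # smooth gradient patch
noisy = rng.uniform(size=(64, 64))                  # speckled texture patch
f_smooth, f_noisy = laws_features(smooth), laws_features(noisy)
```

Feature vectors like these (here 16 energies; the full set with W5 gives 25) would then be fed to SFS and a k-NN or SVM classifier.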
Procedia PDF Downloads 324
214 Quantification of Hydrogen Sulfide and Methyl Mercaptan in Air Samples from a Waste Management Facility
Authors: R. F. Vieira, S. A. Figueiredo, O. M. Freitas, V. F. Domingues, C. Delerue-Matos
Abstract:
The presence of sulphur compounds like hydrogen sulphide and mercaptans is one of the reasons why wastewater treatment and waste management are associated with odour emissions. In this context, having a quantification method for these compounds helps in the optimization of treatment, with the goal of their elimination, namely through biofiltration processes. The aim of this study was the development of a method for the quantification of odorous gases in air samples from waste treatment plants. A method based on headspace solid-phase microextraction (HS-SPME) coupled with gas chromatography - flame photometric detection (GC-FPD) was used to analyse H2S and methyl mercaptan (MM). The extraction was carried out with a 75-μm Carboxen-polydimethylsiloxane fiber coating at 22 ºC for 20 min, and the samples were analysed on a GC 2010 Plus A from Shimadzu with a sulphur filter detector in splitless mode (0.3 min); the column temperature program started at 60 ºC and increased by 15 ºC/min to 100 ºC (held 2 min). The injector temperature was held at 250 ºC, and the detector at 260 ºC. For the calibration curve, a gas dilution unit (digital Hovagas G2 - Multi Component Gas Mixer) was used to prepare the standards. This unit had two input connections, one for a stream of the gas to be diluted and another for a stream of nitrogen, and an output connected to a glass bulb. A 40 ppm H2S cylinder and a 50 ppm MM cylinder were used. The equipment was programmed to the selected concentration and automatically carried out the dilution into the glass bulb. The mixture was left flowing through the glass bulb for 5 min, and then the extremities were closed. This method allowed calibration between 1-20 ppm for H2S and between 0.02-0.1 ppm and 1-3.5 ppm for MM. Several quantifications of air samples from the inlet and outlet of a biofilter operating in a waste management facility in the north of Portugal allowed the evaluation of the biofilter's performance.
Keywords: biofiltration, hydrogen sulphide, mercaptans, quantification
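The calibration step reduces to fitting a response-versus-concentration curve from the diluted standards and inverting it for unknown samples. A sketch with made-up peak areas, assuming a linear detector response over the calibrated range (FPD sulphur response is often linearised over a narrow range; the numbers are not the study's data):

```python
import numpy as np

# illustrative H2S calibration standards: concentration (ppm) vs peak area (a.u.)
ppm = np.array([1, 2, 5, 10, 15, 20], float)
area = np.array([210, 415, 1050, 2080, 3150, 4190], float)

slope, intercept = np.polyfit(ppm, area, 1)   # least-squares calibration line

def quantify(sample_area):
    """Invert the calibration line to estimate concentration in ppm."""
    return (sample_area - intercept) / slope

inlet = quantify(1570.0)    # e.g. biofilter inlet sample
outlet = quantify(320.0)    # e.g. biofilter outlet sample
removal = 1.0 - outlet / inlet
```

Inlet/outlet pairs quantified this way give the removal efficiency used to evaluate biofilter performance.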
Procedia PDF Downloads 474
213 Model-Based Fault Diagnosis in Carbon Fiber Reinforced Composites Using Particle Filtering
Abstract:
Carbon fiber reinforced composites (CFRP) used as aircraft structures are subject to lightning strikes, putting structural integrity at risk. Indirect damage may occur after a lightning strike, where the internal structure can be damaged by the excessive heat induced by the lightning current while the surface of the structure remains intact. Three damage modes may be observed after a lightning strike: fiber breakage, inter-ply delamination, and intra-ply cracks. The assessment of internal damage states in composites is challenging due to the complicated microstructure, inherent uncertainties, and the existence of multiple damage modes. In this work, a model-based approach is adopted to diagnose faults in carbon composites after lightning strikes. A resistor network model is implemented to relate the overall electrical and thermal conduction behavior under a simulated lightning current waveform to the intrinsic temperature-dependent material properties, microstructure, and degradation of the materials. A fault detection and identification (FDI) module utilizes the physics-based model and a particle filtering algorithm to identify the damage mode as well as to calculate the probability of structural failure. Extensive simulation results are provided to substantiate the proposed fault diagnosis methodology for both single-fault and multiple-fault cases. The approach is also demonstrated on transient resistance data collected from an IM7/Epoxy laminate under a simulated lightning strike.
Keywords: carbon composite, fault detection, fault identification, particle filter
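The particle filtering step can be illustrated with a minimal bootstrap (SIR) filter tracking a drifting scalar state from noisy readings. The Gaussian likelihood, the noise levels, and the simulated resistance jump below are illustrative assumptions, not the paper's resistor-network model:

```python
import numpy as np

def particle_filter(observations, n_particles=2000, obs_noise=0.05,
                    process_noise=0.05, rng=None):
    """Bootstrap (SIR) particle filter tracking a slowly varying hidden
    state (e.g. a damage-dependent resistance) from noisy measurements."""
    rng = rng if rng is not None else np.random.default_rng(0)
    particles = rng.uniform(0.0, 2.0, n_particles)   # flat prior on the state
    estimates = []
    for z in observations:
        particles += rng.normal(0.0, process_noise, n_particles)  # propagate
        w = np.exp(-0.5 * ((z - particles) / obs_noise) ** 2)     # likelihood
        w /= w.sum()
        idx = rng.choice(n_particles, n_particles, p=w)           # resample
        particles = particles[idx]
        estimates.append(particles.mean())
    return np.array(estimates)

# simulated "transient resistance": a step change marks the fault onset
rng = np.random.default_rng(3)
true_state = 1.0 + 0.3 * (np.arange(50) > 25)
obs = true_state + rng.normal(0.0, 0.05, 50)
est = particle_filter(obs, rng=np.random.default_rng(4))
```

In the full FDI setting, separate particle populations per damage mode (and their normalising weights) would give the mode probabilities rather than a single state estimate.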
Procedia PDF Downloads 193
212 Quantum Statistical Machine Learning and Quantum Time Series
Authors: Omar Alzeley, Sergey Utev
Abstract:
Minimizing a constrained multivariate function is fundamental to machine learning, and such algorithms are at the core of data mining and data visualization techniques. The decision function that maps input points to output points is based on the result of optimization, and this optimization is central to learning theory. One approach to complex systems, where the dynamics of the system are inferred from a statistical analysis of the fluctuations in time of some associated observable, is time series analysis. The purpose of this paper is a mathematical transition from the autoregressive model of classical time series to the matrix formalization of quantum theory. Firstly, we propose a quantum time series (QTS) model. Although the Hamiltonian technique has become an established tool for detecting deterministic chaos, other approaches are emerging. The quantum probabilistic technique is used to motivate the construction of our QTS model, which resembles the quantum dynamic model that has been applied to financial data. Secondly, various statistical methods, including machine learning algorithms such as the Kalman filter, are applied to estimate and analyse the unknown parameters of the model. Finally, simulation techniques such as Markov chain Monte Carlo have been used to support our investigations. The proposed model has been examined using both real and simulated data. We establish the relation between the quantum statistical machine and the quantum time series via random matrix theory. It is interesting to note that the primary focus of the application of QTS in the field of quantum chaos was to find a model that explains chaotic behaviour; this model may reveal further insight into quantum chaos.
Keywords: machine learning, simulation techniques, quantum probability, tensor product, time series
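As a concrete example of the estimation step, a scalar Kalman filter for a simple AR(1)-plus-noise state-space model — an illustration of the classical side of the transition, not the quantum model itself; all parameter values are made up:

```python
import numpy as np

def kalman_filter(y, a, q, r, x0=0.0, p0=1.0):
    """Scalar Kalman filter for the state-space model
       x_t = a * x_{t-1} + w_t,  w_t ~ N(0, q)   (AR(1) latent state)
       y_t = x_t + v_t,          v_t ~ N(0, r)   (noisy observation)"""
    x, p = x0, p0
    filtered = []
    for obs in y:
        x, p = a * x, a * a * p + q      # predict
        k = p / (p + r)                  # Kalman gain
        x = x + k * (obs - x)            # update
        p = (1 - k) * p
        filtered.append(x)
    return np.array(filtered)

# simulate an AR(1) latent state observed in noise
rng = np.random.default_rng(5)
a_true, q_true, r_true = 0.9, 0.1, 0.5
x = np.zeros(300)
for t in range(1, 300):
    x[t] = a_true * x[t - 1] + rng.normal(0.0, np.sqrt(q_true))
y = x + rng.normal(0.0, np.sqrt(r_true), 300)

xf = kalman_filter(y, a_true, q_true, r_true)
```

The same recursion also yields the one-step prediction errors, whose Gaussian likelihood can be maximised to estimate the unknown parameters (a, q, r).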
Procedia PDF Downloads 467
211 Ecological-Economics Evaluation of Water Treatment Systems
Authors: Hwasuk Jung, Seoi Lee, Dongchoon Ryou, Pyungjong Yoo, Seokmo Lee
Abstract:
The Nakdong River, which is used as the drinking water source for the Busan metropolitan area, is vulnerable in terms of water management because industrial areas are located along its upper reaches. Most citizens of Busan think that the water quality of the Nakdong River is poor, so they boil their tap water or use home filters before drinking it, which imposes unnecessary individual costs on Busan citizens. We need to diversify water intake to reduce these costs and to strengthen the weak water source. Against this background, this study carried out environmental accounting of the Namgang Dam water treatment system compared to the Nakdong River water treatment system, using the emergy analysis method to support reasonable decision-making. The emergy analysis method quantitatively evaluates both the natural environment and human economic activities in an equal unit of measure. The emergy transformity of the Namgang Dam's water was 1.16 times larger than that of the Nakdong River's water; the Namgang Dam's water shows a larger emergy transformity because of its better water quality. The emergy used in making 1 m3 of tap water with the Namgang Dam water treatment system was 1.26 times larger than that of the Nakdong River water treatment system, owing to the construction cost of a new pipeline for intaking Namgang Dam water. If the Won used in making 1 m3 of tap water with the Nakdong River water treatment system is 1, the Namgang Dam water treatment system used 1.66; if the Em-won used is likewise normalised to 1, the Namgang Dam system used 1.26. The cost-benefit ratio in Em-won was therefore smaller than that in Won.
When we use emergy analysis, which accounts for the benefits of the natural environment, such as the good water quality of the Namgang Dam, the Namgang Dam water treatment system could be a good alternative for diversifying the intake source.
Keywords: emergy, emergy transformity, Em-won, water treatment system
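The Won versus Em-won comparison is a simple ratio computation. The sketch below reproduces the normalised figures quoted in the abstract (1.66 and 1.26) with placeholder bookkeeping, not the study's full emergy tables:

```python
# Each system's cost per m3 of tap water, in two accounting units:
# ordinary currency (Won) and emergy-adjusted currency (Em-won).
# Values are normalised so the Nakdong River system is the baseline (= 1).

def ratios(base, alt):
    """Return alt/base ratios for each accounting unit."""
    return {k: alt[k] / base[k] for k in base}

nakdong = {"won_per_m3": 1.00, "emwon_per_m3": 1.00}   # baseline
namgang = {"won_per_m3": 1.66, "emwon_per_m3": 1.26}   # from the abstract

r = ratios(nakdong, namgang)
# The Em-won ratio is smaller than the Won ratio: once the environmental
# benefit (better raw-water quality) is accounted for via emergy, the
# Namgang Dam option looks relatively less expensive.
```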
Procedia PDF Downloads 302
210 A Novel Hybrid Deep Learning Architecture for Predicting Acute Kidney Injury Using Patient Record Data and Ultrasound Kidney Images
Authors: Sophia Shi
Abstract:
Acute kidney injury (AKI) is the sudden onset of kidney damage in which the kidneys cannot filter waste from the blood, requiring emergency hospitalization. The AKI patient mortality rate is high in the ICU, and AKI is virtually impossible for doctors to predict because it is so unexpected. Currently, there is no hybrid model predicting AKI that takes advantage of two types of data. De-identified patient data from the MIMIC-III database and de-identified kidney images with corresponding patient records from the Beijing Hospital of the Ministry of Health were collected. Using data features including serum creatinine, among others, two numeric models were built using the MIMIC and Beijing Hospital data, and an image-only model was built with the hospital ultrasounds. Convolutional neural networks (CNN) were used: VGG and ResNet for the numeric data and ResNet for the image data. They were combined into a hybrid model by concatenating the feature maps of both types of models to create a new input, which enters another CNN block and then two fully connected layers, ending in a binary output after a softmax layer. The hybrid model successfully predicted AKI; its highest AUROC was 0.953, with an accuracy of 90% and an F1-score of 0.91. This model can be implemented in urgent clinical settings such as the ICU and aid doctors by assessing the risk of AKI shortly after the patient's admission to the ICU, so that doctors can take preventative measures and diminish mortality risks and severe kidney damage.
Keywords: acute kidney injury, convolutional neural network, hybrid deep learning, patient record data, ResNet, ultrasound kidney images, VGG
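The fusion step — concatenating the two branches' feature maps before a fully connected softmax head — can be sketched with plain numpy forward-pass arithmetic. The feature dimensions and random stand-in features are illustrative assumptions, not trained CNN outputs:

```python
import numpy as np

rng = np.random.default_rng(6)

def softmax(z):
    """Numerically stable row-wise softmax."""
    e = np.exp(z - z.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

# stand-ins for each branch's penultimate feature vectors (batch of 8 patients)
numeric_features = rng.normal(size=(8, 64))    # from the patient-record branch
image_features = rng.normal(size=(8, 128))     # from the ultrasound ResNet branch

# fusion: concatenate per patient, then a fully connected head + softmax
fused = np.concatenate([numeric_features, image_features], axis=1)  # (8, 192)
W = rng.normal(scale=0.1, size=(192, 2))
b = np.zeros(2)
probs = softmax(fused @ W + b)   # per-patient [no-AKI, AKI] probabilities
```

In the actual architecture the fused tensor passes through a further CNN block and two trained fully connected layers; the concatenation-then-classify pattern is the same.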
Procedia PDF Downloads 131
209 Effects of Essential Oils on the Intestinal Microflora of Termite (Heterotermes indicola)
Authors: Ayesha Aihetasham, Najma Arshad, Sobia Khan
Abstract:
Damage caused by subterranean termites is of major concern today. Termites have mostly been treated with pesticides, which has resulted in several problems related to health and the environment. For this reason, plant-derived natural products, specifically essential oils, have been evaluated as a means of controlling termites. The aim of the present study was to investigate the antitermitic potential of six essential oils against the subterranean termite Heterotermes indicola. A no-choice bioassay was used to assess the termiticidal action of the essential oils. Further, the gut from each treated termite group was extracted and analyzed for the reduction in the numbers of protozoa and bacteria, by the protozoal count method using a haemocytometer and by viable bacterial plate counts (dilution method), respectively. In the no-choice bioassay, it was found that Foeniculum vulgare oil causes a high degree of mortality: 90% average mortality at a 10 mg oil concentration (10 mg/0.42 g weight of filter paper). The least mortality appeared to be due to Citrus sinensis oil (43.33% average mortality at 10 mg/0.42 g). The highest activity proved to be that of Foeniculum vulgare, followed by Eruca sativa, Trigonella foenum-graecum, Peganum harmala, Syzygium cumini, and Citrus sinensis. The essential oil that caused the greatest reduction in the number of protozoa was P. harmala, followed by T. foenum-graecum and E. sativa. In the case of bacterial counts, E. sativa oil produced the greatest decrease in bacterial numbers (6.4×10⁹ CFU/ml). It is concluded that F. vulgare, E. sativa, and P. harmala essential oils are highly effective against the H. indicola termite and its gut microflora.
Keywords: bacterial count, essential oils, Heterotermes indicola, protozoal count
Procedia PDF Downloads 243
208 A West Coast Estuarine Case Study: A Predictive Approach to Monitor Estuarine Eutrophication
Authors: Vedant Janapaty
Abstract:
Estuaries are wetlands where fresh water from streams mixes with salt water from the sea. Also known as the "kidneys of our planet", they are extremely productive environments that filter pollutants, absorb floods from sea level rise, and shelter a unique ecosystem. However, eutrophication and the loss of native species are ailing our wetlands. There is a lack of uniform data collection and sparse research on correlations between satellite data and in situ measurements. Remote sensing (RS) has shown great promise in environmental monitoring. This project attempts to use satellite data and correlate metrics with in situ observations collected at five estuaries. Satellite images were processed in Python to calculate 7 spectral indices (SIs), and average SI values were calculated per month for 23 years. Publicly available data from 6 sites at ELK was used to obtain 10 parameters (OPs), and average OP values were likewise calculated per month for 23 years. Linear correlations between the 7 SIs and 10 OPs were made and found to be inadequate (correlation = 1 to 64%). Fourier transform analysis was then performed on the 7 SIs; dominant frequencies and amplitudes were extracted for each SI, and a machine learning (ML) model was trained, validated, and tested for the 10 OPs. Better correlations were observed between SIs and OPs at certain time delays (0-, 3-, 4-, and 6-month delays), and ML was performed again. The OPs saw improved R² values in the range of 0.2 to 0.93. This approach can be used to obtain periodic analyses of overall wetland health from satellite indices. It shows that remote sensing can be used to develop correlations with critical in situ parameters that measure eutrophication, and can be used by practitioners to easily monitor wetland health.
Keywords: estuary, remote sensing, machine learning, Fourier transform
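The Fourier feature-extraction step can be sketched as follows. The 23-year monthly cadence matches the abstract, but the synthetic annual and semi-annual cycle is an illustrative assumption standing in for a real spectral-index series:

```python
import numpy as np

def dominant_components(series, dt=1.0, k=2):
    """Return the k strongest frequencies and amplitudes of a real series."""
    spec = np.fft.rfft(series - series.mean())      # demean, one-sided FFT
    freqs = np.fft.rfftfreq(len(series), d=dt)      # cycles per sample
    amps = np.abs(spec) * 2 / len(series)           # sinusoid amplitudes
    order = np.argsort(amps)[::-1][:k]
    return freqs[order], amps[order]

# 23 years of monthly samples with annual (period 12) and semi-annual cycles
t = np.arange(23 * 12)
series = 0.8 * np.sin(2 * np.pi * t / 12) + 0.2 * np.sin(2 * np.pi * t / 6)

freqs, amps = dominant_components(series)
```

The per-index (frequency, amplitude) pairs extracted this way become the inputs on which the ML model is trained against the in situ parameters.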
Procedia PDF Downloads 102
207 Empowering Transformers for Evidence-Based Medicine
Authors: Jinan Fiaidhi, Hashmath Shaik
Abstract:
Breaking the barrier to practicing evidence-based medicine relies on effective methods for rapidly identifying relevant evidence in the body of biomedical literature. An important challenge confronting medical practitioners is the long time needed to browse, filter, summarize, and compile information from different medical resources. Deep learning can help solve this through automatic question answering (Q&A) and transformers. However, Q&A and transformer technologies are not trained to answer clinical queries that can be used for evidence-based practice, nor can they respond to structured clinical questioning protocols like PICO (Patient/Problem, Intervention, Comparison, and Outcome). This article describes the use of deep learning techniques for Q&A based on transformer models like BERT and GPT to answer PICO clinical questions that can be used for evidence-based practice, extracted from sound medical research resources like PubMed. We report acceptable clinical answers that are supported by findings from PubMed. Our transformer methods reach an acceptable state-of-the-art performance based on a two-stage bootstrapping process: filtering relevant articles, followed by identifying articles that support the requested outcome expressed by the PICO question. Moreover, we also report experiments that empower our bootstrapping techniques with patch attention to the most important keywords in the clinical case and the PICO questions. Our bootstrapping patched with attention shows the relevance of the evidence collected, based on entropy metrics.
Keywords: automatic question answering, PICO questions, evidence-based medicine, generative models, LLM transformers
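The two-stage bootstrapping can be caricatured without any model at all: stage 1 keeps articles that overlap the Patient/Intervention terms, and stage 2 re-ranks the survivors by weighted overlap with the Outcome terms — a crude keyword stand-in for the transformer reranking and patch attention described here; the toy articles and PICO fields are invented:

```python
# A model-free sketch of the two-stage retrieval idea. Real runs would use a
# BERT/GPT reranker; everything below is a deliberately simple stand-in.

def tokens(text):
    return set(text.lower().split())

def two_stage_rank(articles, pico):
    # stage 1: keep articles overlapping the Patient/Intervention vocabulary
    stage1 = [a for a in articles
              if tokens(a) & tokens(pico["patient"] + " " + pico["intervention"])]
    # stage 2: "attention"-like boost on the Outcome keywords, then re-rank
    weights = {w: 2.0 for w in tokens(pico["outcome"])}
    scored = [(sum(weights.get(w, 0.0) for w in tokens(a)), a) for a in stage1]
    return [a for s, a in sorted(scored, reverse=True) if s > 0]

articles = [
    "metformin lowers hba1c in diabetes patients",
    "statins reduce cholesterol in adults",
    "diabetes patients on metformin show weight loss",
]
pico = {"patient": "diabetes patients",
        "intervention": "metformin",
        "outcome": "hba1c"}

ranked = two_stage_rank(articles, pico)
```

Only the article that both survives stage 1 and supports the requested outcome remains, mirroring the filter-then-support structure of the bootstrapping pipeline.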
Procedia PDF Downloads 41
206 Quantitative Proteome Analysis and Bioactivity Testing of New Zealand Honeybee Venom
Authors: Maryam Ghamsari, Mitchell Nye-Wood, Kelvin Wang, Angela Juhasz, Michelle Colgrave, Don Otter, Jun Lu, Nazimah Hamid, Thao T. Le
Abstract:
Bee venom, a complex mixture of peptides, proteins, enzymes, and other bioactive compounds, has been widely studied for its therapeutic applications. This study investigated the proteins present in New Zealand (NZ) honeybee venom (BV) using bottom-up proteomics. Two sample digestion techniques, in-solution digestion and filter-aided sample preparation (FASP), were employed to determine the optimal method for protein digestion. Sequential Window Acquisition of All Theoretical Mass Spectra (SWATH-MS) analysis was conducted to quantify the protein composition of NZ BV and to investigate variations across collection years. Our results revealed a high protein content (158.12 µg/mL), with the FASP method yielding a larger number of identified proteins (125) than in-solution digestion (95). SWATH-MS indicated melittin and phospholipase A2 as the most abundant proteins. Significant variations in protein composition were observed across samples from different years (2018, 2019, 2021), with implications for the venom's bioactivity. In vitro testing demonstrated immunomodulatory and antioxidant activities, with a viable range for cell growth established at 1.5-5 µg/mL. The study underscores the value of proteomic tools in characterizing bioactive compounds in bee venom, paving the way for deeper exploration of their therapeutic potential. Further research is needed to fractionate the venom and elucidate the mechanisms of action of the identified bioactive components.
Keywords: honeybee venom, proteomics, bioactivity, fractionation, SWATH-MS, melittin, phospholipase A2, New Zealand, immunomodulatory, antioxidant
Procedia PDF Downloads 38
205 A Sub-Conjunctiva Injection of Rosiglitazone for Anti-Fibrosis Treatment after Glaucoma Filtration Surgery
Authors: Yang Zhao, Feng Zhang, Xuanchu Duan
Abstract:
Trans-differentiation of human Tenon fibroblasts (HTFs) into myofibroblasts and fibrosis of the episcleral tissue are the most common reasons for the failure of glaucoma filtration surgery, and treatment options are limited: antimetabolites often have side effects such as leakage of the filtering bleb, infection, hypotony, and endophthalmitis. Rosiglitazone, a specific thiazolidinedione, is a synthetic high-affinity ligand for PPAR-γ that has been used in the treatment of type 2 diabetes and has been found to have pleiotropic functions against inflammatory response, cell proliferation, and tissue fibrosis, benefiting a variety of conditions in animal myocardium models, steatohepatitis models, etc. Here, in vitro, we cultured primary HTFs, stimulated them with TGF-β to induce a myofibrogenic phenotype, and then treated the cells with Rosiglitazone to assess the fibrogenic response. In vivo, we used a rabbit glaucoma model to establish the formation of post-trabeculectomy scarring. We then administered a subconjunctival injection of Rosiglitazone beside the filtering bleb, after which the protein, mRNA, and immunofluorescence of fibrogenic markers were checked, and the condition of the filtering bleb was assessed. In vitro, we found that Rosiglitazone could suppress the proliferation and migration of fibroblasts through macroautophagy via the TGF-β/Smad signaling pathway. In vivo, on postoperative day 28, the Rosiglitazone injection group had the lowest mean number of fibroblasts and the least collagen content and connective tissue growth factor. Rosiglitazone effectively controlled human and rabbit fibroblasts in vivo and in vitro. Its subconjunctival application may represent an effective new avenue for the prevention of scarring after glaucoma surgery.
Keywords: fibrosis, glaucoma, macroautophagy, rosiglitazone
Procedia PDF Downloads 270