Search results for: signal prediction
924 The Influence of Caregivers’ Preparedness and Role Burden on Quality of Life among Stroke Patients
Authors: Yeaji Seok, Myung Kyung Lee
Abstract:
Background: Even if patients survive a stroke, they may experience disability in mobility, sensation, cognition, and speech and language. Stroke patients require rehabilitation for functional recovery and daily living over a considerable time. During rehabilitation, the role of caregivers is important. However, stroke patients’ quality of life may deteriorate due to family caregivers’ lack of preparedness and increased role burden. Purpose: To investigate how caregivers’ preparedness and role burden predict stroke patients’ quality of life. Methods: The target population was stroke patients hospitalized for rehabilitation and their family caregivers. A total of 153 patient-family caregiver dyads were recruited from June to August 2021. Data were collected from self-reported questionnaires and analyzed using descriptive statistics, t-tests, the chi-squared test, one-way analysis of variance, Pearson’s correlation coefficients, and multiple regression with the SPSS Statistics 28 program. Results: Family caregivers’ preparedness affected stroke patients’ mobility (β = 0.20, p < 0.05), character (β = -0.084, p < 0.05), and production activities (β = -0.197, p < 0.05) in quality of life. The role burden of family caregivers affected language skills (β = 0.310, p < 0.05), visual functions (β = -0.357, p < 0.05), thinking skills (β = 0.443, p = 0.05), mood conditions (β = 0.565, p < 0.001), family roles (β = -0.361, p < 0.001), and social roles (β = -0.304, p < 0.001), while the caregivers’ burden of performing self-protection negatively affected patients’ social roles (β = 0.180, p = 0.048). In addition, caregivers’ role burden of personal life sacrifice affected patients’ mobility (β = 0.311, p < 0.05), self-care (β = 0.232, p < 0.05), and energy (β = 0.239, p < 0.05). Conclusion: This study indicated that family caregivers' preparedness and role burden affected stroke patients’ quality of life.
The results of this study suggested that interventions to improve family caregivers’ preparedness and to reduce role burden are required to support quality of life in stroke patients.
Keywords: quality of life, preparedness, role burden, caregivers, stroke
Procedia PDF Downloads 210
923 Prediction of Antibacterial Peptides against Propionibacterium acnes from the Peptidomes of Achatina fulica Mucus Fractions
Authors: Suwapitch Chalongkulasak, Teerasak E-Kobon, Pramote Chumnanpuen
Abstract:
Acne vulgaris is a common skin disease mainly caused by the Gram-positive pathogenic bacterium Propionibacterium acnes. This bacterium stimulates the inflammation process in human sebaceous glands. The giant African snail (Achatina fulica) is an alien species that reproduces rapidly and seriously damages agricultural products in Thailand. Several research reports have described the medical and pharmaceutical benefits of the peptides and proteins in this snail’s mucus. This study aimed to predict in silico multifunctional bioactive peptides from the A. fulica mucus peptidome using several bioinformatic tools for the determination of antimicrobial (iAMPpred), anti-biofilm (dPABBs), cytotoxic (ToxinPred), cell-membrane-penetrating (CPPpred), and anti-quorum-sensing (QSPpred) peptides. Three candidate peptides with the highest predictive scores were selected and re-designed/modified to improve the required activities. Structural and physicochemical properties of six anti-P. acnes (APA) peptide candidates were characterized using the PEP-FOLD3 program and the five aforementioned tools. All candidates had random-coil structures and were named APA1-ori, APA2-ori, APA3-ori, APA1-mod, APA2-mod, and APA3-mod. To validate the APA activity, these peptide candidates were synthesized and tested against six isolates of P. acnes. The modified APA peptides showed high APA activity on some isolates. Therefore, our biomimetic mucus peptides could be useful for preventing acne vulgaris and should be further examined for other activities important to medical and pharmaceutical applications.
Keywords: Propionibacterium acnes, Achatina fulica, peptidomes, antibacterial peptides, snail mucus
Procedia PDF Downloads 133
922 Fire and Explosion Consequence Modeling Using Fire Dynamic Simulator: A Case Study
Authors: Iftekhar Hassan, Sayedil Morsalin, Easir A Khan
Abstract:
Accidents involving fire have occurred frequently in recent times, and their causes show a great deal of variety, so the required intervention methods and risk assessment strategies are unique in each case. On September 4, 2020, a fire and explosion occurred in a confined space, caused by a methane gas leak from an underground pipeline, in the Baitus Salat Jame mosque during night (Esha) prayer in Narayanganj District, Bangladesh, killing 34 people. In this research, this incident is simulated using the Fire Dynamics Simulator (FDS) software to analyze and understand the nature of the accident and the associated consequences. FDS is an advanced computational fluid dynamics (CFD) system for fire-driven fluid flow which numerically solves a large eddy simulation form of the Navier-Stokes equations for simulation of fire and smoke spread and prediction of thermal radiation, toxic substance concentrations, and other relevant fire parameters. This study focuses on understanding the nature of the fire and on consequence evaluation due to thermal radiation caused by a vapor cloud explosion. An evacuation model was constructed to visualize the effect of evacuation time and the fractional effective dose (FED) for different types of agents. The results were presented by 3D animation, sliced pictures, and graphical representation to understand the fire hazards caused by thermal radiation or smoke due to a vapor cloud explosion. This study will help to design and develop appropriate response strategies for preventing similar accidents.
Keywords: consequence modeling, fire and explosion, fire dynamics simulation (FDS), thermal radiation
Procedia PDF Downloads 225
921 Conjugated Chitosan-Carboxymethyl-5-Fluorouracil Nanoparticles for Skin Delivery
Authors: Mazita Mohd Diah, Anton V. Dolzhenko, Tin Wui Wong
Abstract:
Nanoparticles, being small with a large specific surface area, increase solubility, enhance bioavailability, improve controlled release, and enable precision targeting of the entrapped compounds. In this study, chitosan, as a polymeric permeation enhancer, was conjugated to a polar pro-drug, carboxymethyl-5-fluorouracil (CMFU), to increase skin drug permeation. The chitosan-CMFU conjugate was synthesized using a chemical conjugation process through a succinate linker. It was then transformed into nanoparticles via a spray-drying method. The conjugation was elucidated using Fourier Transform Infrared and Proton Nuclear Magnetic Resonance techniques. The nanoparticle size, size distribution, zeta potential, drug content, and skin permeation and retention profiles were characterized. The conjugation was denoted in the 1H NMR spectrum of chitosan-CMFU by new peaks at δ = 4.184 ppm (singlet, 2H for CH2) and 7.676-7.688 ppm (doublet, 1H for C6) attributed to CMFU. The nanoparticles had a particle size of 93.97 ± 35.11 nm, polydispersity index of 0.40 ± 0.14, zeta potential of +18.25 ± 2.95 mV, and drug content of 6.20 ± 1.98 % w/w. Almost 80 % w/w of CMFU in the form of nanoparticles permeated through the skin in 24 hours, and close to 50 % w/w of the permeation occurred in the first 1-2 hours. Without conjugation to chitosan and nanoparticulation, less than 40 % w/w of CMFU permeated through the skin in 24 hours. Skin drug retention was likewise higher with chitosan-CMFU nanoparticles (15.34 ± 5.82 % w/w) than with CMFU alone (2.24 ± 0.57 % w/w). Through conjugation with the chitosan permeation enhancer and processing in nanogeometry, CMFU showed enhanced skin permeation and retention.
Keywords: carboxymethyl-5-fluorouracil, chitosan, conjugate, skin permeation, skin retention
Procedia PDF Downloads 365
920 Ontology-Driven Knowledge Discovery and Validation from Admission Databases: A Structural Causal Model Approach for Polytechnic Education in Nigeria
Authors: Bernard Igoche Igoche, Olumuyiwa Matthew, Peter Bednar, Alexander Gegov
Abstract:
This study presents an ontology-driven approach for knowledge discovery and validation from admission databases in Nigerian polytechnic institutions. The research aims to address the challenges of extracting meaningful insights from vast amounts of admission data and utilizing them for decision-making and process improvement. The proposed methodology combines the knowledge discovery in databases (KDD) process with a structural causal model (SCM) ontological framework. The admission database of Benue State Polytechnic Ugbokolo (Benpoly) is used as a case study. The KDD process is employed to mine and distill knowledge from the database, while the SCM ontology is designed to identify and validate the important features of the admission process. The SCM validation is performed using the conditional independence test (CIT) criteria, and an algorithm is developed to implement the validation process. The identified features are then used for machine learning (ML) modeling and prediction of admission status. The results demonstrate the adequacy of the SCM ontological framework in representing the admission process and the high predictive accuracies achieved by the ML models, with k-nearest neighbors (KNN) and support vector machine (SVM) achieving 92% accuracy. The study concludes that the proposed ontology-driven approach contributes to the advancement of educational data mining and provides a foundation for future research in this domain.
Keywords: admission databases, educational data mining, machine learning, ontology-driven knowledge discovery, polytechnic education, structural causal model
Procedia PDF Downloads 62
919 Novel Correlations for P-Substituted Phenols in NMR Spectroscopy
Authors: Khodzhaberdi Allaberdiev
Abstract:
Substituted phenols are widely used for the synthesis of advanced polycondensation polymers. In terms of structural regularity and the practical value of the obtained polymers, the p-substituted phenols are of special interest. The lanthanide-induced shifts (LIS) of the aromatic ring and OH protons on addition of Eu(fod)3 to various p-substituted phenols in CDCl3 solvent were measured by Nuclear Magnetic Resonance spectroscopy. A linear relationship has been observed between the LIS of the protons (∆ = δcomplex − δsubstrate) and the Eu(fod)3/substrate molar ratios. The LIS of the protons of the investigated phenols decreases in the following order: OH > ortho > meta. The LIS of these protons also depends on both the steric and electronic effects of the p-substituents. The effect of the steric hindrance of the substituents on the LIS of the protons was studied by way of example for p-substituted alkylphenols. Alkylphenols exhibit pronounced europium-induced shifts, their sensitivity increasing in the order CH3 > C2H5 > sym-C5H11 > tert-C5H11 > tert-C4H9, i.e., in parallel with decreasing steric hindrance. The influence of the steric hindrance of the p-substituents on the LIS of the protons decreases in the following sequence: OH > meta > ortho. Contrary to expectations, it was found that the LIS of the ortho protons shows an excellent linear correlation with the meta-substituent constants, σm, for 14 p-substituted phenols: ∆H2,6 = 8.165 − 9.896 σm (r² = 0.999). Moreover, a linear correlation between the LIS of the ortho protons and the ionization constants, pKa, of the p-substituted phenols has been revealed. Similarly, linear relationships for the LIS of the meta and OH protons were obtained. The LIS of the phenolic hydroxyl groups must be used with care in these linear relationships because of the signal broadening of the OH protons. New constants may be determined for unusual cases by this approach.
Keywords: novel correlations, NMR spectroscopy, phenols, shift reagent
Procedia PDF Downloads 301
918 Modified Clusterwise Regression for Pavement Management
Authors: Mukesh Khadka, Alexander Paz, Hanns de la Fuente-Mella
Abstract:
Typically, pavement performance models are developed in two steps: (i) pavement segments with similar characteristics are grouped together to form a cluster, and (ii) the corresponding performance models are developed using statistical techniques. A challenge is to select the characteristics that define clusters and the segments associated with them. If inappropriate characteristics are used, clusters may include homogeneous segments with different performance behavior or heterogeneous segments with similar performance behavior. The prediction accuracy of performance models can be improved by grouping the pavement segments into more uniform clusters using both characteristics and a performance measure. This grouping is not always possible due to limited information. It is impractical to include all the potential significant factors because some of them are potentially unobserved or difficult to measure. Historical performance of pavement segments can be used as a proxy to incorporate the effect of the missing potential significant factors into the clustering process. The current state-of-the-art proposes Clusterwise Linear Regression (CLR) to determine the pavement clusters and the associated performance models simultaneously. CLR incorporates the effect of significant factors as well as a performance measure. In this study, a mathematical program was formulated for CLR models including multiple explanatory variables. Pavement data collected recently over the entire state of Nevada were used. The International Roughness Index (IRI) was used as the pavement performance measure because it serves as a unified standard that is widely accepted for evaluating pavement performance, especially in terms of riding quality. The results illustrate the advantage of using CLR. Previous studies have used CLR along with experimental data.
This study uses actual field data collected across a variety of environmental, traffic, design, and construction and maintenance conditions.
Keywords: clusterwise regression, pavement management system, performance model, optimization
Procedia PDF Downloads 251
917 A Cephalometric Superimposition of a Skeletal Class III Orthognathic Patient on Nasion-Sella Line
Authors: Albert Suryaprawira
Abstract:
The Nasion-Sella Line (NSL) has been used for many years as a reference line in longitudinal growth studies. This line is therefore considered stable enough not only to evaluate treatment outcome and to predict the possibility of relapse but also to manage prognosis. This is a radiographic superimposition of an adult male aged 19 years who complained of difficulty with aesthetics, talking, and chewing. The patient had a midface hypoplasia (concave) profile. He was diagnosed with a severe Skeletal Class III with Class III malocclusion, increased lower vertical height, and an anterior open bite. A pre-treatment cephalometric radiograph was taken to analyse the skeletal problem and to measure the amount of bone movement and the predicted soft tissue response. A panoramic radiograph was also taken to analyse bone quality, bone abnormalities, third molar impaction, etc. Before the surgery, a pre-surgical cephalometric radiograph was taken to re-evaluate the plan and to settle the final amount of bone cut. After the surgery, a post-surgical cephalometric radiograph was taken to compare the result with the plan. Superimposition of those radiographs using the NSL as a reference line was performed to analyse the outcome. It is important to describe the amount of hard and soft tissue movement and to predict the possibility of relapse after the surgery. The patient also needs to understand the whole surgical plan, the outcome, and relapse prevention. The surgical management included maxillary impaction and advancement with a Le Fort I osteotomy. Evaluation using the NSL as a reference was a very useful method for determining the outcome and prognosis.
Keywords: Nasion-Sella Line, midface hypoplasia, Le Fort I, maxillary advancement
Procedia PDF Downloads 142
916 External Store Safe Separation Evaluation Process Implementing CFD and MIL-HDBK-1763
Authors: Thien Bach Nguyen, Nhu-Van Nguyen, Phi-Minh Nguyen, Minh Hien Dao
Abstract:
An external store safe separation evaluation process implementing CFD and MIL-HDBK-1763 is proposed to support the evaluation of, and compliance with, external store safe separation requirements through extensive use of CFD and the criteria from MIL-HDBK-1763. The criteria for safe separation are researched and investigated across various standards and handbooks, such as MIL-HDBK-1763, MIL-HDBK-244A, AGARD-AG-202, and AGARD-AG-300, to acquire appropriate, tailored values and limits for typical applications of external carriages and fighter aircraft. CFD and 6DOF simulations are used extensively in ANSYS 2023 R1 software for verification and validation of moving unstructured meshes and solvers by calibrating the position, aerodynamic forces, and moments of existing air-to-ground missile models. The verified CFD and 6DOF simulation separation process is applied to the investigation of typical munition separation phenomena and compliance with the tailored requirements of MIL-HDBK-1763. The prediction of munition trajectory parameters after separation, accounting for aircraft aerodynamic interference and the specified rack unit, is provided and checked against the tailored requirements to support the safe separation evaluation of improved and new external store munitions before flight tests are performed. The proposed process demonstrates effectiveness and reliability in building understanding of complicated store separation and in reducing flight test sorties during improved and new munition development projects by extensive use of CFD and tailoring of the existing standards.
Keywords: external store separation, MIL-HDBK-1763, CFD, moving meshes, flight test data, munition
Procedia PDF Downloads 23
915 Development of an Implicit Physical Influence Upwind Scheme for Cell-Centered Finite Volume Method
Authors: Shidvash Vakilipour, Masoud Mohammadi, Rouzbeh Riazi, Scott Ormiston, Kimia Amiri, Sahar Barati
Abstract:
An essential component of a finite volume method (FVM) is the advection scheme that estimates values on the cell faces based on the calculated values at the nodes or cell centers. The most widely used advection schemes are upwind schemes. These schemes have been developed in FVM on different kinds of structured and unstructured grids. In this research, the physical influence scheme (PIS) is developed for a cell-centered FVM that uses an implicit coupled solver. Results are compared with the exponential differencing scheme (EDS) and the skew upwind differencing scheme (SUDS). The accuracy of these schemes is evaluated for a lid-driven cavity flow at Re = 1000, 3200, and 5000 and a backward-facing step flow at Re = 800. Simulations show considerable differences between the results of the EDS scheme and benchmarks, especially for the lid-driven cavity flow at high Reynolds numbers. These differences occur due to false diffusion. Comparing the SUDS and PIS schemes shows relatively close results for the backward-facing step flow and different results for the lid-driven cavity flow. The poor results of SUDS in the lid-driven cavity flow can be related to its lack of sensitivity to the pressure difference between the cell face and upwind points, which is critical for the prediction of such vortex-dominated flows.
Keywords: cell-centered finite volume method, coupled solver, exponential differencing scheme (EDS), physical influence scheme (PIS), pressure weighted interpolation method (PWIM), skew upwind differencing scheme (SUDS)
Procedia PDF Downloads 284
914 Statistical Model of Water Quality in Estero El Macho, Machala-El Oro
Authors: Rafael Zhindon Almeida
Abstract:
Surface water quality is an important concern for the evaluation and prediction of water quality conditions. The objective of this study is to develop a statistical model that can accurately predict the water quality of the El Macho estuary in the city of Machala, El Oro province. The methodology involves a thorough review of theoretical foundations to improve the understanding of statistical modeling for water quality analysis. The research design is correlational, using a multivariate statistical model involving multiple linear regression and principal component analysis. The results indicate that water quality parameters such as fecal coliforms, biochemical oxygen demand, chemical oxygen demand, iron, and dissolved oxygen exceed the allowable limits. The water of the El Macho estuary is determined to be below the required water quality criteria. The multiple linear regression model, based on chemical oxygen demand and total dissolved solids, explains 99.9% of the variance of the dependent variable. In addition, principal component analysis shows that the model has an explanatory power of 86.242%. The study successfully developed a statistical model to evaluate the water quality of the El Macho estuary. The estuary did not meet the water quality criteria, with several parameters exceeding the allowable limits. The multiple linear regression model and principal component analysis provide valuable information on the relationships between the various water quality parameters. The findings of the study emphasize the need for immediate action to improve the water quality of the El Macho estuary to ensure the preservation and protection of this valuable natural resource.
Keywords: statistical modeling, water quality, multiple linear regression, principal components, statistical models
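As an illustrative sketch only (not the authors' pipeline or data), the two techniques named in this abstract — a multiple linear regression whose R² measures explained variance, and a principal component analysis whose components carry explained-variance fractions — can be reproduced with NumPy alone on synthetic, hypothetical readings:

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical predictors: chemical oxygen demand (COD) and total dissolved solids (TDS)
cod = rng.uniform(20, 200, size=50)
tds = rng.uniform(100, 1000, size=50)
# Synthetic response with small noise, standing in for the dependent variable
wqi = 0.3 * cod + 0.05 * tds + rng.normal(0, 0.5, size=50)

# Multiple linear regression via ordinary least squares
X = np.column_stack([np.ones_like(cod), cod, tds])   # intercept + predictors
beta, *_ = np.linalg.lstsq(X, wqi, rcond=None)
pred = X @ beta
r2 = 1 - np.sum((wqi - pred) ** 2) / np.sum((wqi - wqi.mean()) ** 2)

# Principal component analysis via SVD of the standardized data matrix
Z = np.column_stack([cod, tds, wqi])
Z = (Z - Z.mean(axis=0)) / Z.std(axis=0)
_, s, _ = np.linalg.svd(Z, full_matrices=False)
explained = s ** 2 / np.sum(s ** 2)   # variance fraction per component
```

On this synthetic data the regression R² is near 1 and `explained` sums to 1, mirroring the kind of "explains 99.9% of the variance" and "explanatory power of 86.242%" figures the abstract reports.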
Procedia PDF Downloads 98
913 Uncertainty in Near-Term Global Surface Warming Linked to Pacific Trade Wind Variability
Authors: M. Hadi Bordbar, Matthew England, Alex Sen Gupta, Agus Santoso, Andrea Taschetto, Thomas Martin, Wonsun Park, Mojib Latif
Abstract:
Climate models generally simulate long-term reductions in the Pacific Walker Circulation with increasing atmospheric greenhouse gases. However, over two recent decades (1992-2011), there was a strong intensification of the Pacific Trade Winds that is linked with a slowdown in global surface warming. Using large ensembles of multiple climate models forced by increasing atmospheric greenhouse gas concentrations and starting from different ocean and/or atmospheric initial conditions, we reveal very diverse 20-year trends in the tropical Pacific climate associated with considerable uncertainty in the globally averaged surface air temperature (SAT) in each model ensemble. This result suggests low confidence in our ability to accurately predict SAT trends over a 20-year timescale from external forcing alone. We show, however, that the uncertainty can be reduced when the initial oceanic state is adequately known and well represented in the model. Our analyses suggest that internal variability in the Pacific trade winds can mask the anthropogenic signal over a 20-year time frame and drive transitions between periods of accelerated global warming and temporary slowdown periods.
Keywords: trade winds, walker circulation, hiatus in the global surface warming, internal climate variability
Procedia PDF Downloads 268
912 X-Ray Dosimetry by a Low-Cost Current Mode Ion Chamber
Authors: Ava Zarif Sanayei, Mustafa Farjad-Fard, Mohammad-Reza Mohammadian-Behbahani, Leyli Ebrahimi, Sedigheh Sina
Abstract:
The fabrication and testing of a low-cost air-filled ion chamber for X-ray dosimetry are studied. The chamber is made of a metal cylinder, a central wire, a BC517 Darlington transistor, a 9 V DC battery, and a voltmeter, in order to have a cost-effective means of measuring the dose. The output current of the dosimeter is amplified by the transistor and then fed to the large internal resistance of the voltmeter, producing a readable voltage signal. The dose-response linearity of the ion chamber is evaluated for different exposure scenarios of the X-ray tube: kVp values of 70, 90, and 120, and mAs values up to 20 are considered. In all experiments, a solid-state dosimeter (Solidose 400, Elimpex Medizintechnik) is used as a reference device for chamber calibration. Each exposure case is repeated three times, the voltmeter and Solidose readings are recorded, and the mean and standard deviation values are calculated. The calibration curve, derived by plotting voltmeter readings against Solidose readings, gave a linear fit for all tube kVps, with linear relationships of 99%, 98%, and 100% for kVp values of 70, 90, and 120, respectively. The study shows the feasibility of achieving acceptable dose measurements with a simplified setup. Further enhancements to the proposed setup include solutions for limiting the leakage current, optimizing the chamber dimensions, utilizing electronic microcontrollers for dedicated data readout, and minimizing the impact of stray electromagnetic fields on the system.
Keywords: dosimetry, ion chamber, radiation detection, X-ray
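The calibration step described above — plotting voltmeter readings against the reference dosimeter readings and deriving a linear fit — is an ordinary least-squares line fit. A minimal sketch follows; the paired readings are hypothetical placeholders, not the study's measurements:

```python
import numpy as np

# Hypothetical paired readings: reference dose (mGy) vs. voltmeter signal (V)
dose = np.array([0.5, 1.0, 2.0, 4.0, 8.0])
volts = np.array([0.11, 0.21, 0.40, 0.82, 1.61])

# Least-squares calibration line: volts ≈ slope * dose + intercept
slope, intercept = np.polyfit(dose, volts, 1)
fit = slope * dose + intercept

# Coefficient of determination of the calibration line
ss_res = np.sum((volts - fit) ** 2)
ss_tot = np.sum((volts - volts.mean()) ** 2)
r2 = 1 - ss_res / ss_tot
```

Once `slope` and `intercept` are known, an unknown dose is recovered from a voltmeter reading as `(reading - intercept) / slope`, which is how a fitted calibration curve of this kind is normally used.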
Procedia PDF Downloads 77
911 Numerical Investigation of the Transverse Instability in Radiation Pressure Acceleration
Authors: F. Q. Shao, W. Q. Wang, Y. Yin, T. P. Yu, D. B. Zou, J. M. Ouyang
Abstract:
The Radiation Pressure Acceleration (RPA) mechanism is very promising for laser-driven ion acceleration because of its high laser-to-ion energy conversion efficiency. Although some experiments have shown the characteristics of RPA, the energy of the ions is quite limited. The ion energy obtained in experiments is only several MeV/u, which is much lower than the theoretical prediction. One possible limiting factor is the transverse instability incited during the RPA process. The transverse instability is basically considered to be a Rayleigh-Taylor (RT) instability, a kind of interfacial instability that occurs when a light fluid pushes against a heavy fluid. Multi-dimensional particle-in-cell (PIC) simulations show that the onset of the transverse instability destroys the acceleration process and broadens the energy spectrum of the fast ions during RPA-dominant ion acceleration. Evidence of RT instability driven by radiation pressure has been observed in a laser-foil interaction experiment in a typical RPA regime, and the dominant scale of the RT instability is close to the laser wavelength. The development of the transverse instability in radiation-pressure-acceleration-dominant laser-foil interaction is numerically examined by two-dimensional particle-in-cell simulations. When a laser interacts with a foil with a modulated surface, the instability is quickly incited and develops. The linear growth and saturation of the transverse instability are observed, and the growth rate is numerically diagnosed. In order to optimize the interaction parameters, a method based on information entropy is put forward to describe the degree of chaos of the transverse instability. With moderate modulation, the transverse instability shows a low degree of chaos, and a quasi-monoenergetic proton beam is produced.
Keywords: information entropy, radiation pressure acceleration, Rayleigh-Taylor instability, transverse instability
Procedia PDF Downloads 345
910 Localization of Pyrolysis and Burning of Ground Forest Fires
Authors: Pavel A. Strizhak, Geniy V. Kuznetsov, Ivan S. Voytkov, Dmitri V. Antonov
Abstract:
This paper presents the results of experiments carried out at a specialized test site to establish macroscopic patterns of heat and mass transfer processes when localizing model combustion sources of ground forest fires with the use of barrier lines in the form of a wetted layer of material in front of the zone of flame burning and thermal decomposition. The experiments were performed using needles, leaves, twigs, and mixtures thereof. The dimensions of the model combustion source and the ranges of heat release correspond well to the real conditions of ground forest fires. The main attention is paid to a complex analysis of the effect of the dispersion of the water aerosol (concentration and size of droplets) used to form the barrier line. It is shown that effective conditions for localization and subsequent suppression of flame combustion and thermal decomposition of forest fuel can be achieved by creating a group of barrier lines with different wetting widths and depths of the material. Relative indicators of the effectiveness of single and combined barrier lines were established, taking into account all the main characteristics of the processes of suppressing burning and thermal decomposition of forest combustible materials. We performed a prediction of the necessary and sufficient parameters of barrier lines (water volume, width and depth of the wetted layer of material, specific irrigation density) for combustion sources with different dimensions, corresponding to real fire extinguishing practice.
Keywords: forest fire, barrier water lines, pyrolysis front, flame front
Procedia PDF Downloads 132
909 Electroencephalography (EEG) Analysis of Alcoholic and Control Subjects Using Multiscale Permutation Entropy
Authors: Lal Hussain, Wajid Aziz, Sajjad Ahmed Nadeem, Saeed Arif Shah, Abdul Majid
Abstract:
Brain electrical activity, as reflected in electroencephalography (EEG), has been analyzed and used for diagnosis with various techniques. Among them, measures of complexity, nonlinearity, disorder, and unpredictability play a vital role due to the nonlinear interconnection between the functional and anatomical subsystems of the brain in the healthy state and during various diseases. Alcohol abuse has many social and economic consequences, including memory weakness and impairments in decision-making and concentration. Alcoholism not only harms the brain but is also associated with emotional, behavioral, and cognitive impairments, damaging the white and gray brain matter. A recently developed signal analysis method, Multiscale Permutation Entropy (MPE), is proposed to estimate the complexity of long-range temporally correlated EEG time series of alcoholic and control subjects acquired from the University of California machine learning repository, and the results are compared with Multiscale Sample Entropy (MSE). In MPE, a coarse-grained series is first generated, and the permutation entropy (PE) is computed for each coarse-grained time series for the electrodes O1, O2, C3, C4, F2, F3, F4, F7, F8, Fp1, Fp2, P3, P4, T7, and T8. The results computed for each electrode using MPE give higher significance values than MSE, with correspondingly larger mean rank differences. Likewise, the ROC and the area under the ROC curve give better separation for each electrode using MPE in comparison to MSE.
Keywords: electroencephalogram (EEG), multiscale permutation entropy (MPE), multiscale sample entropy (MSE), permutation entropy (PE), Mann-Whitney test (MMT), receiver operator curve (ROC), complexity measure
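The MPE procedure the abstract describes — coarse-graining the series at each scale, then computing permutation entropy on each coarse-grained series — can be sketched as follows. This is a generic NumPy implementation of Bandt-Pompe permutation entropy, not the authors' code, and the parameter choices are illustrative:

```python
import numpy as np
from math import factorial

def permutation_entropy(x, order=3, normalize=True):
    """Bandt-Pompe permutation entropy of a 1-D series (base-2 log)."""
    x = np.asarray(x, dtype=float)
    n = len(x) - order + 1
    # Ordinal pattern (rank order) of each length-`order` embedded vector
    patterns = np.array([np.argsort(x[i:i + order]) for i in range(n)])
    _, counts = np.unique(patterns, axis=0, return_counts=True)
    p = counts / counts.sum()
    pe = -np.sum(p * np.log2(p))
    # Normalize by the maximum entropy, log2(order!), so PE lies in [0, 1]
    return pe / np.log2(factorial(order)) if normalize else pe

def multiscale_permutation_entropy(x, order=3, max_scale=5):
    """Coarse-grain the series at each scale, then compute PE per scale."""
    x = np.asarray(x, dtype=float)
    result = []
    for scale in range(1, max_scale + 1):
        m = len(x) // scale
        # Non-overlapping window averages form the coarse-grained series
        coarse = x[:m * scale].reshape(m, scale).mean(axis=1)
        result.append(permutation_entropy(coarse, order))
    return result
```

A perfectly monotonic series contains a single ordinal pattern and yields PE = 0, while white noise approaches the normalized maximum of 1, which is the sense in which PE quantifies unpredictability.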
Procedia PDF Downloads 495
908 Dynamic Analysis of the Heat Transfer in the Magnetically Assisted Reactor
Authors: Tomasz Borowski, Dawid Sołoducha, Rafał Rakoczy, Marian Kordas
Abstract:
The application of a magnetic field is essential for a wide range of technologies and processes (e.g., magnetic hyperthermia, bioprocessing). From a practical point of view, bioprocess control is often limited to the regulation of temperature at constant values favourable to microbial growth. The main aim of this study is to determine the effect of various types of electromagnetic fields (i.e., static or alternating) on heat transfer in a self-designed magnetically assisted reactor. The experimental set-up is equipped with a measuring instrument which controls the temperature of the liquid inside the container and supervises the real-time acquisition of all the experimental data coming from the sensors. Temperature signals are also sampled from the generator of the magnetic field. The obtained temperature profiles were mathematically described and analyzed. The parameters characterizing the response of a first-order dynamic system to a step input were obtained and discussed. For example, higher values of the time constant mean a slower increase of the signal (in this case, temperature). After a period equal to about five time constants, the sample temperature nearly reaches the asymptotic value. This dynamic analysis allowed us to understand the heating effect under the action of various types of electromagnetic fields. Moreover, the proposed mathematical description can be used to compare the influence of different types of magnetic fields on heat transfer operations.
Keywords: heat transfer, magnetically assisted reactor, dynamical analysis, transient function
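The first-order step response invoked above has a closed form, T(t) = T∞ + (T0 − T∞)·exp(−t/τ), and the "about five time constants" remark corresponds to 1 − e⁻⁵ ≈ 99.3% of the step being completed. A minimal sketch with hypothetical temperatures and time constant:

```python
import numpy as np

def first_order_step(t, T0, T_inf, tau):
    """Response of a first-order system to a step input:
    T(t) = T_inf + (T0 - T_inf) * exp(-t / tau)."""
    return T_inf + (T0 - T_inf) * np.exp(-np.asarray(t, dtype=float) / tau)

# Hypothetical heating run: liquid starts at 20 C, asymptote 60 C, tau = 120 s
t = np.linspace(0.0, 10 * 120.0, 601)
temperature = first_order_step(t, T0=20.0, T_inf=60.0, tau=120.0)
```

At t = 5τ the response has covered about 99.3% of the step, which is why the abstract can treat the temperature as having nearly reached its asymptote after roughly five time constants.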
907 PAPR Reduction of FBMC Using Sliding Window Tone Reservation Active Constellation Extension Technique
Authors: S. Anuradha, V. Sandeep Kumar
Abstract:
The high Peak-to-Average Power Ratio (PAPR) in Filter Bank Multicarrier with Offset Quadrature Amplitude Modulation (FBMC-OQAM) can significantly reduce power efficiency and performance. In this paper, we address the problem of PAPR reduction for FBMC-OQAM systems using the Tone Reservation (TR) technique. Due to the overlapping structure of FBMC-OQAM signals, directly applying the TR schemes of OFDM systems to FBMC-OQAM systems is not effective. We improve the TR technique by employing a sliding window with Active Constellation Extension for the PAPR reduction of FBMC-OQAM signals, called the sliding window tone reservation Active Constellation Extension (SW-TRACE) technique. The proposed SW-TRACE technique uses the peak reduction tones (PRTs) of several consecutive data blocks to cancel the peaks of the FBMC-OQAM signal inside a window, while dynamically extending outer constellation points in active (data-carrying) channels, within margin-preserving constraints, in order to minimize the peak magnitude. Analysis and simulation results are compared with the existing TR technique for the FBMC-OQAM system. The proposed SW-TRACE method has better PAPR performance and lower computational complexity.
Keywords: FBMC-OQAM, peak-to-average power ratio, sliding window, tone reservation, active constellation extension
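The quantity being minimized is easy to state precisely; a minimal sketch of the PAPR of a discrete complex baseband signal (illustrative only — in the paper the PAPR is evaluated on oversampled FBMC-OQAM frames, not on arbitrary sample lists):

```python
from math import log10

def papr_db(samples):
    """Peak-to-average power ratio of a complex baseband signal, in dB."""
    powers = [abs(x) ** 2 for x in samples]
    peak = max(powers)
    avg = sum(powers) / len(powers)
    return 10 * log10(peak / avg)
```

A constant-envelope signal has a PAPR of 0 dB; multicarrier signals like FBMC-OQAM exhibit much higher values because many subcarriers can add constructively, which is what TR-based methods try to suppress.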
906 A Deep Learning Approach to Calculate Cardiothoracic Ratio From Chest Radiographs
Authors: Pranav Ajmera, Amit Kharat, Tanveer Gupte, Richa Pant, Viraj Kulkarni, Vinay Duddalwar, Purnachandra Lamghare
Abstract:
The cardiothoracic ratio (CTR) is the ratio of the diameter of the heart to the diameter of the thorax. An abnormal CTR, that is, a value greater than 0.55, is often an indicator of an underlying pathological condition. The accurate prediction of an abnormal CTR from chest X-rays (CXRs) aids in the early diagnosis of clinical conditions. We propose a deep learning-based model for automatic CTR calculation that can assist the radiologist with the diagnosis of cardiomegaly and optimize the radiology workflow. The study population included 1012 posteroanterior (PA) CXRs from a single institution. The Attention U-Net deep learning (DL) architecture was used for the automatic calculation of CTR. A CTR of 0.55 was used as a cut-off to categorize the condition as cardiomegaly present or absent. An observer performance test was conducted to assess the radiologist's performance in diagnosing cardiomegaly with and without artificial intelligence (AI) assistance. The Attention U-Net model was highly specific in calculating the CTR. The model exhibited a sensitivity of 0.80 [95% CI: 0.75, 0.85], precision of 0.99 [95% CI: 0.98, 1], and an F1 score of 0.88 [95% CI: 0.85, 0.91]. During the analysis, we observed that 51 out of 1012 samples were misclassified by the model when compared to annotations made by the expert radiologist. We further observed that the sensitivity of the reviewing radiologist in identifying cardiomegaly increased from 40.50% to 88.4% when aided by the AI-generated CTR. Our segmentation-based AI model demonstrated high specificity and sensitivity for CTR calculation. The performance of the radiologist on the observer performance test improved significantly with AI assistance. A DL-based segmentation model for rapid quantification of CTR can therefore have significant potential to be used in clinical workflows.
Keywords: cardiomegaly, deep learning, chest radiograph, artificial intelligence, cardiothoracic ratio
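Given heart and thorax segmentation masks (the output of a model such as Attention U-Net), the CTR itself reduces to a width ratio; a hedged sketch (the row-of-0/1 mask representation and helper names are illustrative, not the paper's implementation):

```python
def mask_width(mask_rows):
    """Widest horizontal extent (in pixels) over all rows of a binary mask,
    given as a list of rows of 0/1 values."""
    best = 0
    for row in mask_rows:
        cols = [i for i, v in enumerate(row) if v]
        if cols:
            best = max(best, cols[-1] - cols[0] + 1)
    return best

def cardiothoracic_ratio(heart_mask, thorax_mask):
    """CTR = maximal heart width / maximal thoracic width."""
    return mask_width(heart_mask) / mask_width(thorax_mask)

def is_cardiomegaly(ctr, cutoff=0.55):
    """Apply the 0.55 cut-off used in the study."""
    return ctr > cutoff
```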
905 Automated Fact-Checking by Incorporating Contextual Knowledge and Multi-Faceted Search
Authors: Wenbo Wang, Yi-Fang Brook Wu
Abstract:
The spread of misinformation and disinformation has become a major concern, particularly with the rise of social media as a primary source of information for many people. As a means to address this phenomenon, automated fact-checking has emerged as a safeguard against the spread of misinformation and disinformation. Existing fact-checking approaches aim to determine whether a news claim is true or false, and they have achieved decent veracity prediction accuracy. However, state-of-the-art methods rely on manually verified external information to assist the checking model in making judgments, which requires significant human resources. This study introduces a framework, SAC, which focuses on 1) augmenting the representation of a claim by incorporating additional context from general-purpose, comprehensive, and authoritative data; 2) developing a search function to automatically select relevant, new, and credible references; 3) attending to the parts of the representations of a claim and its references that are most relevant to the fact-checking task. The experimental results demonstrate that 1) augmenting the representations of claims and references through the use of a knowledge base, combined with the multi-head attention technique, contributes to improved fact-checking performance; 2) SAC with auto-selected references outperforms existing fact-checking approaches with manually selected references. Future directions of this study include I) exploring knowledge graphs in Wikidata to dynamically augment the representations of claims and references without introducing too much noise, and II) exploring semantic relations in claims and references to further enhance fact-checking.
Keywords: fact-checking, claim verification, deep learning, natural language processing
904 Forecasting Nokoué Lake Water Levels Using Long Short-Term Memory Network
Authors: Namwinwelbere Dabire, Eugene C. Ezin, Adandedji M. Firmin
Abstract:
The prediction of hydrological flows (rainfall-depth or rainfall-discharge) is becoming increasingly important in the management of hydrological risks such as floods. In this study, the Long Short-Term Memory (LSTM) network, a state-of-the-art algorithm dedicated to time series, is applied to predict the daily water level of Nokoué Lake in Benin. This paper aims to provide an effective and reliable method capable of reproducing the future daily water level of Nokoué Lake, which is influenced by a combination of two phenomena: rainfall and river flow (runoff from the Ouémé River, the Sô River, the Porto-Novo lagoon, and the Atlantic Ocean). Performance analysis based on the forecasting horizon indicates that LSTM can predict the water level of Nokoué Lake up to a forecast horizon of t+10 days. Performance metrics such as Root Mean Square Error (RMSE), the coefficient of determination (R²), Nash-Sutcliffe Efficiency (NSE), and Mean Absolute Error (MAE) agree on a forecast horizon of up to t+3 days. The values of these metrics remain stable for forecast horizons of t+1 days, t+2 days, and t+3 days. The values of R² and NSE are greater than 0.97 during the training and testing phases in the Nokoué Lake basin. Based on the evaluation indices used to assess the model's performance for the appropriate forecast horizon of water level in the Nokoué Lake basin, the forecast horizon of t+3 days is chosen for predicting future daily water levels.
Keywords: forecasting, long short-term memory cell, recurrent artificial neural network, Nokoué Lake
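The evaluation indices named above are standard and easy to state; a stdlib-only sketch (illustrative, using toy observed/simulated series rather than the lake data):

```python
from math import sqrt

def rmse(obs, sim):
    """Root Mean Square Error between observed and simulated series."""
    return sqrt(sum((o - s) ** 2 for o, s in zip(obs, sim)) / len(obs))

def mae(obs, sim):
    """Mean Absolute Error."""
    return sum(abs(o - s) for o, s in zip(obs, sim)) / len(obs)

def nse(obs, sim):
    """Nash-Sutcliffe Efficiency: 1 is a perfect fit; 0 means the model is
    no better than predicting the mean of the observations."""
    mean_obs = sum(obs) / len(obs)
    num = sum((o - s) ** 2 for o, s in zip(obs, sim))
    den = sum((o - mean_obs) ** 2 for o in obs)
    return 1 - num / den
```

The paper's finding that R² and NSE stay above 0.97 up to t+3 days means the LSTM explains nearly all the observed variance at those horizons, which is why t+3 is chosen as the operational forecast horizon.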
903 Quantitative Analysis of Caffeine in Pharmaceutical Formulations Using a Cost-Effective Electrochemical Sensor
Authors: Y. T. Gebreslassie, Abrha Tadesse, R. C. Saini, Rishi Pal
Abstract:
Caffeine, known chemically as 3,7-dihydro-1,3,7-trimethyl-1H-purine-2,6-dione, is a naturally occurring alkaloid classified as an N-methyl derivative of xanthine. Given its widespread use in coffee and other caffeine-containing products, it is the most commonly consumed psychoactive substance in everyday human life. This research aimed to develop a cost-effective, sensitive, and easily manufacturable sensor for the detection of caffeine. An anthraquinone-modified carbon paste electrode (AQMCPE) was fabricated, and the electrochemical behavior of caffeine on this electrode was investigated using cyclic voltammetry (CV) and square wave voltammetry (SWV) in a solution of 0.1 M perchloric acid at pH 0.56. The modified electrode displayed enhanced electrocatalytic activity towards caffeine oxidation, exhibiting a two-fold increase in peak current and an 82 mV shift of the peak potential in the negative direction compared to an unmodified carbon paste electrode (UMCPE). Exploiting the electrocatalytic properties of the modified electrode, SWV was employed for the quantitative determination of caffeine. Under optimized experimental conditions, a linear relationship between peak current and concentration was observed within the range of 2.0 × 10⁻⁶ to 1.0 × 10⁻⁴ M, with a correlation coefficient of 0.998 and a detection limit of 1.47 × 10⁻⁷ M (signal-to-noise ratio = 3). Finally, the proposed method was successfully applied to the quantitative analysis of caffeine in pharmaceutical formulations, yielding recovery percentages ranging from 95.27% to 106.75%.
Keywords: anthraquinone-modified carbon paste electrode, caffeine, detection, electrochemical sensor, quantitative analysis
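The calibration arithmetic behind figures such as the detection limit and the recovery percentages can be sketched as follows (a stdlib-only illustration with made-up numbers; the helper functions and data are not from the paper):

```python
def linear_fit(x, y):
    """Ordinary least-squares slope and intercept for a calibration line
    (peak current y versus concentration x)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    slope = sxy / sxx
    return slope, my - slope * mx

def detection_limit(blank_sd, slope):
    """Limit of detection at a signal-to-noise ratio of 3: 3 * sd(blank) / slope."""
    return 3 * blank_sd / slope

def recovery_percent(found, added):
    """Recovery of a spiked amount, in percent."""
    return 100.0 * found / added
```

This is the usual route from an SWV calibration curve to the reported figures of merit: the slope of the fitted line sets the sensitivity, and the 3σ criterion on blank noise gives the detection limit.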
902 COVID_ICU_BERT: A Fine-Tuned Language Model for COVID-19 Intensive Care Unit Clinical Notes
Authors: Shahad Nagoor, Lucy Hederman, Kevin Koidl, Annalina Caputo
Abstract:
Doctors’ notes reflect their impressions, attitudes, clinical sense, and opinions about patients’ conditions and progress, as well as other information that is essential for doctors’ daily clinical decisions. Despite their value, clinical notes are insufficiently researched within the language processing community. Automatically extracting information from unstructured text data is known to be a difficult task, as opposed to dealing with structured information such as vital physiological signs, images, and laboratory results. The aim of this research is to investigate how Natural Language Processing (NLP) and machine learning techniques applied to clinicians' notes can assist doctors’ decision-making in the Intensive Care Unit (ICU) for coronavirus disease 2019 (COVID-19) patients. The hypothesis is that clinical outcomes like survival or mortality can be useful in influencing the judgement of clinical sentiment in ICU clinical notes. This paper makes two contributions: first, we introduce COVID_ICU_BERT, a fine-tuned version of clinical transformer models that can reliably predict clinical sentiment for notes of COVID patients in the ICU. We train the model on clinical notes for COVID-19 patients, a type of note that was not previously seen by clinicalBERT or Bio_Discharge_Summary_BERT. The model, which is based on clinicalBERT, achieves higher predictive accuracy (Acc 93.33%, AUC 0.98, and precision 0.96). Second, we perform data augmentation using clinical contextual word embeddings based on a pre-trained clinical model to balance the samples in each class of the data (survived vs. deceased patients). Data augmentation improves the accuracy of prediction slightly (Acc 96.67%, AUC 0.98, and precision 0.92).
Keywords: BERT fine-tuning, clinical sentiment, COVID-19, data augmentation
901 Hyaluronic Acid Binding to Link Domain of Stabilin-2 Receptor
Authors: Aleksandra Twarda, Dobrosława Krzemień, Grzegorz Dubin, Tad A. Holak
Abstract:
Stabilin-2 belongs to the group of scavenger receptors and plays a crucial role in the clearance of more than 10 ligands from the bloodstream, including hyaluronic acid, products of degradation of the extracellular matrix, and metabolic products. The Link domain, a defining feature of stabilin-2, has a sequence similar to the Link domains in other hyaluronic acid receptors, such as CD44 or TSG-6, and is responsible for most of the ligand binding. Present knowledge of signal transduction by stabilin-2, as well as of the ligand recognition and binding mechanism, is limited. Until now, no experimental structures have been solved for any segments of stabilin-2. It has recently been demonstrated that stabilin-2 knock-out or blocking of the receptor by an antibody effectively opposes cancer metastasis by elevating the level of circulating hyaluronic acid. Moreover, loss of expression of stabilin-2 in the peri-tumourous liver correlates with increased survival. Solving the crystal structure of stabilin-2 and elucidating the binding mechanism of hyaluronic acid could enable precise characterization of the interactions in the binding site. These results may allow for the design of specific small-molecule inhibitors of stabilin-2 that could be used in cancer therapy. To carry out crystallization screening of stabilin-2, we cloned constructs of the Link domain of various lengths, with or without the surrounding domains. The folding properties of the constructs were checked by nuclear magnetic resonance (NMR). It is planned to demonstrate the binding of hyaluronic acid to the Link domain using several biochemical methods, including NMR, isothermal titration calorimetry, and fluorescence polarization assays.
Keywords: stabilin-2, Link domain, X-ray crystallography, NMR, hyaluronic acid, cancer
900 Current Approach in Biodosimetry: Electrochemical Detection of DNA Damage
Authors: Marcela Jelicova, Anna Lierova, Zuzana Sinkorova, Radovan Metelka
Abstract:
At present, electrochemical methods are used in various research fields, especially for the analysis of biological molecules. This offers the possibility of using the detection of oxidative DNA damage induced indirectly by γ-rays in biodosimetry. The main goal of our study is to optimize the detection of 8-hydroxyguanine by differential pulse voltammetry. The level of this stable and specific indicator of DNA damage could be determined in DNA isolated from peripheral blood lymphocytes, plasma, or urine of irradiated individuals. Screen-printed carbon electrodes modified with carboxy-functionalized multi-walled carbon nanotubes were utilized for highly sensitive electrochemical detection of 8-hydroxyguanine. The electrochemical oxidation of 8-hydroxyguanine monitored by differential pulse voltammetry was found to be pH-dependent, and the most intensive signal was recorded at pH 7. After recalculating the current density, several times higher sensitivity was attained in comparison with previously published results obtained using screen-printed carbon electrodes with unmodified carbon ink. Subsequently, the modified electrochemical technique was used for the detection of 8-hydroxyguanine in calf thymus DNA samples irradiated by a ⁶⁰Co gamma source in the dose range from 0.5 to 20 Gy, using various types of sample pretreatment and measurement conditions. This method could serve for fast retrospective quantification of the absorbed dose in cases of accidental exposure to ionizing radiation and may play an important role in biodosimetry.
Keywords: biodosimetry, electrochemical detection, voltammetry, 8-hydroxyguanine
899 Determination of Tide Height Using Global Navigation Satellite Systems (GNSS)
Authors: Faisal Alsaaq
Abstract:
Hydrographic surveys have traditionally relied on the availability of tide information for the reduction of sounding observations to a common datum. In most cases, tide information is obtained from tide gauge observations and/or tide predictions over space and time using local, regional, or global tide models. While the latter often provide a rather crude approximation, the former rely on tide gauge stations that are spatially restricted and often sparsely distributed. A more recent method that is increasingly being used is Global Navigation Satellite System (GNSS) positioning, which can be utilised to monitor height variations of a vessel or buoy, thus providing information on sea level variations during the time of a hydrographic survey. However, GNSS heights obtained in the dynamic environment of a survey vessel are affected by "non-tidal" processes such as wave activity and the attitude of the vessel (roll, pitch, heave, and dynamic draft). This research seeks to examine techniques that separate the tide signal from the other, non-tidal signals that may be contained in GNSS heights. This requires an investigation of the processes involved and their temporal, spectral, and stochastic properties in order to apply suitable techniques for recovering the tide information. In addition, different post-mission and near real-time GNSS positioning techniques will be investigated, with a focus on height estimation at sea. Furthermore, the study will investigate the possibility of transferring the chart datum to the locations of tide gauges.
Keywords: hydrography, GNSS, datum, tide gauge
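One simple way to suppress short-period, non-tidal motion (waves, heave) in a GNSS height series while retaining the slow tide signal is low-pass filtering; a minimal centered moving-average sketch (illustrative only — the study's actual separation techniques are not specified here, and in practice the averaging window must be long relative to the wave period yet short relative to the tidal period):

```python
def moving_average(heights, window):
    """Centered moving average of a height series: a simple low-pass filter
    that attenuates short-period wave/heave motion but passes the slow tide.
    Edges use a shortened (truncated) window."""
    half = window // 2
    out = []
    for i in range(len(heights)):
        lo, hi = max(0, i - half), min(len(heights), i + half + 1)
        out.append(sum(heights[lo:hi]) / (hi - lo))
    return out
```

The residual (raw heights minus the filtered series) then contains the wave-band signal, which is one crude way to inspect the non-tidal content of the record.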
898 Prediction of Remaining Life of Industrial Cutting Tools with Deep Learning-Assisted Image Processing Techniques
Authors: Gizem Eser Erdek
Abstract:
This study investigates the prediction of the remaining life of industrial cutting tools used in production processes with deep learning methods. As the life of a cutting tool decreases, it damages the raw material it is processing. The aim is therefore to predict the remaining life of the cutting tool based on the damage it causes to the raw material. For this purpose, hole photos were collected from a hole-drilling machine over 8 months. The photos were labeled in 5 classes according to hole quality, transforming the problem into a classification problem. Using the prepared data set, a model was created with convolutional neural networks, a deep learning method. In addition, the VGGNet and ResNet architectures, which have been successful in the literature, were tested on the data set. A hybrid model combining convolutional neural networks and support vector machines was also used for comparison. When all models were compared, the convolutional neural network model gave the best results, with a 74% accuracy rate. In preliminary studies in which the data set was reduced to only the best and worst classes, the binary classification model achieved ~93% accuracy. The results of this study showed that the remaining life of cutting tools can be predicted by deep learning methods based on the damage to the raw material. The experiments demonstrate that deep learning methods can be used as an alternative for cutting tool life estimation.
Keywords: classification, convolutional neural network, deep learning, remaining life of industrial cutting tools, ResNet, support vector machine, VGGNet
897 Magnetic Resonance Imaging in Children with Brain Tumors
Authors: J. R. Ashrapov, G. A. Alihodzhaeva, D. E. Abdullaev, N. R. Kadirbekov
Abstract:
The diagnosis of brain tumors is challenging, as several central nervous system diseases present with the same symptoms. Modern diagnostic techniques such as CT and MRI help to significantly improve surgical planning and allow timely identification of postoperative complications in neurosurgery. Purpose: To study the MRI characteristics and localization of brain tumors in children and to detect complications in the postoperative period. Materials and methods: A retrospective study of the treatment of 62 children with brain tumors aged 2 to 5 years was performed. Results: MRI scans of the brain revealed a tumor in 52 (83.8%) of the 62 patients. The distribution of brain tumors found on MRI was: 15 (24.1%) glioblastomas, 21 (33.8%) astrocytomas, 7 (11.2%) medulloblastomas, and 9 (14.5%) tumors of other origin (craniopharyngiomas, chordoma of the skull base). MRI revealed the following characteristic features: a heterogeneous MRI signal, hyper- and hypointense in T1 and T2 modes, with varying degrees of perifocal edema and involvement of the cerebral vessels. The main objectives of a postoperative MRI study are the identification of early or late postoperative complications, the evaluation of the radicality of surgery, and the identification of continued tumor growth. MRI was performed in the following cases: 1. suspicion of a hematoma (3 days or more after surgery); 2. suspicion of continued tumor growth (at 3-4 weeks). Conclusions: Magnetic resonance imaging is a highly informative method for the diagnosis of brain tumors in children. MRI also helps to determine the effectiveness and tactics of treatment and the follow-up in the postoperative period.
Keywords: brain tumors, children, MRI, treatment
896 Peculiarities of Internal Friction and Shear Modulus in 60Co γ-Rays Irradiated Monocrystalline SiGe Alloys
Authors: I. Kurashvili, G. Darsavelidze, T. Kimeridze, G. Chubinidze, I. Tabatadze
Abstract:
At present, a number of modern semiconductor devices based on SiGe alloys have been created in which the latest achievements of high technology are used. These devices might bring significant changes to networking, computing, and space technology. In the near future, new materials based on SiGe will be able to compete with the A3B5 and Si technologies and firmly establish themselves in medium-frequency electronics. Effective realization of these prospects requires solving the problems of predicting and controlling the structural state and the dynamic physical-mechanical properties of new SiGe materials. Under these circumstances, a complex investigation of structural defects and structure-sensitive dynamic mechanical characteristics of SiGe alloys under different external impacts (deformation, radiation, thermal cycling) acquires great importance. The temperature and amplitude dependences of the internal friction (IF) and shear modulus of monocrystalline boron-doped Si1-xGex (x ≤ 0.05) alloys grown by the Czochralski technique are studied in the initial and ⁶⁰Co gamma-irradiated states. In the initial samples, a set of relaxation processes of dislocation origin and the accompanying modulus defects are revealed in the temperature interval of 400-800 °C. It is shown that after gamma irradiation, the intensity of relaxation internal friction in the vicinity of 280 °C increases, and simultaneously the activation parameters of the high-temperature relaxation processes clearly rise. It is proposed that these changes in the dynamic mechanical characteristics might be caused by a decrease in dislocation mobility in the Cottrell atmosphere enriched by radiation defects.
Keywords: internal friction, shear modulus, gamma-irradiation, SiGe alloys
895 Application Difference between Cox and Logistic Regression Models
Authors: Idrissa Kayijuka
Abstract:
The logistic regression and Cox regression (proportional hazards) models are currently employed in the analysis of prospective epidemiologic research into risk factors for chronic diseases, and a theoretical relationship between the two models has been studied. By definition, the Cox regression model, also called the Cox proportional hazards model, is a procedure for modeling time-to-event data in the presence of censored cases. The logistic regression model, in contrast, is applicable where the independent variables consist of numerical as well as nominal values and the outcome variable is binary (dichotomous). Many researchers have reviewed the Cox and logistic regression models and their applications in different areas. In this work, the analysis is done on secondary data from the SPSS breast cancer exercise data set, with a sample size of 1121 women, where the main objective is to show the difference in application between the Cox and logistic regression models based on factors that cause women to die of breast cancer. Some analysis (on lymph node status) was done manually, and SPSS software was used to analyze the remaining data. This study found that there is a difference in application between the two models: the Cox regression model is used if one wishes to analyze data that include follow-up time, whereas the logistic regression model analyzes data without follow-up time. They also differ in their measure of association: the hazard ratio for the Cox model and the odds ratio for the logistic model. A similarity between the two models is that both are applicable to the prediction of the outcome of a categorical variable, i.e., a variable that can take only a restricted number of categories.
In conclusion, the Cox regression model differs from the logistic regression model by assessing a rate instead of a proportion. Both models are suitable methods for analyzing such data, but when follow-up time is available, the Cox regression model is recommended.
Keywords: logistic regression model, Cox regression model, survival analysis, hazard ratio
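The two measures of association named above can be illustrated with a toy 2x2 / person-time calculation (a hedged sketch with made-up numbers — not the regression models themselves, and not the study's breast cancer data):

```python
def odds_ratio(d_exposed, n_exposed, d_unexposed, n_unexposed):
    """Odds ratio from counts of deaths (d) among exposed and unexposed
    groups: the measure of association reported by logistic regression."""
    odds_exposed = d_exposed / (n_exposed - d_exposed)
    odds_unexposed = d_unexposed / (n_unexposed - d_unexposed)
    return odds_exposed / odds_unexposed

def rate_ratio(d_exposed, persontime_exposed, d_unexposed, persontime_unexposed):
    """Incidence-rate ratio using follow-up (person-time); under constant
    hazards this coincides with the hazard ratio estimated by a Cox model."""
    return (d_exposed / persontime_exposed) / (d_unexposed / persontime_unexposed)
```

The contrast mirrors the conclusion above: the odds ratio uses only counts (a proportion), while the rate ratio divides by follow-up time (a rate), which is exactly the information the Cox model exploits and the logistic model ignores.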