Search results for: deep vibro techniques
6889 DenseNet and Autoencoder Architecture for COVID-19 Chest X-Ray Image Classification and Improved U-Net Lung X-Ray Segmentation
Authors: Jonathan Gong
Abstract:
Purpose: AI-driven solutions are at the forefront of many pathology and medical imaging methods. Using algorithms designed to improve the experience of medical professionals within their respective fields, the efficiency and accuracy of diagnosis can improve. In particular, X-rays are a fast and relatively inexpensive test that can diagnose diseases. In recent years, however, X-rays have not been widely used to detect and diagnose COVID-19. This underuse of X-rays is mainly due to low diagnostic accuracy and confounding with pneumonia, another respiratory disease. However, research in this field suggests that artificial neural networks can successfully diagnose COVID-19 with high accuracy. Models and Data: The dataset used is the COVID-19 Radiography Database, which includes images and masks of chest X-rays under the labels COVID-19, normal, and pneumonia. The classification model developed uses an autoencoder and a pre-trained convolutional neural network (DenseNet201) to provide transfer learning to the model. The model then uses a deep neural network to finalize the feature extraction and predict the diagnosis for the input image. This model was trained on 4,035 images and validated on 807 images separate from those used for training. The images used to train the classification model share an important feature: they are cropped beforehand to eliminate distractions during training. The image segmentation model uses an improved U-Net architecture. This model is used to extract the lung mask from the chest X-ray image and is trained on 8,577 images with a validation split of 20%. The models are evaluated on an external validation dataset, and their accuracy, precision, recall, F1-score, IOU, and loss are calculated. Results: The classification model achieved an accuracy of 97.65% and a loss of 0.1234 when differentiating COVID-19-infected, pneumonia-infected, and normal lung X-rays. The segmentation model achieved an accuracy of 97.31% and an IOU of 0.928. Conclusion: The models proposed can detect COVID-19, pneumonia, and normal lungs with high accuracy and derive the lung mask from a chest X-ray with similarly high accuracy. The hope is for these models to elevate the experience of medical professionals and provide insight into the future of the methods used.
Keywords: artificial intelligence, convolutional neural networks, deep learning, image processing, machine learning
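As a rough illustration of the transfer-learning setup described above, the sketch below builds a classifier from a pre-trained DenseNet201 backbone with a small fully connected head for the three classes (COVID-19, normal, pneumonia). The head sizes, learning rate, and dummy batch are assumptions for illustration; the paper's autoencoder stage is omitted.

```python
import torch
import torch.nn as nn
from torchvision import models

# Pre-trained DenseNet201 backbone provides the transfer learning;
# its classifier is replaced by a small head for the 3 classes.
backbone = models.densenet201(weights=models.DenseNet201_Weights.DEFAULT)
for p in backbone.parameters():
    p.requires_grad = False  # freeze the convolutional features

num_features = backbone.classifier.in_features  # 1920 for DenseNet201
backbone.classifier = nn.Sequential(
    nn.Linear(num_features, 256),   # assumed head size
    nn.ReLU(),
    nn.Dropout(0.3),
    nn.Linear(256, 3),              # COVID-19 / normal / pneumonia
)

criterion = nn.CrossEntropyLoss()
optimizer = torch.optim.Adam(backbone.classifier.parameters(), lr=1e-4)

# One training step on a dummy batch of cropped chest X-rays (224x224 RGB).
images = torch.randn(8, 3, 224, 224)
labels = torch.randint(0, 3, (8,))
loss = criterion(backbone(images), labels)
loss.backward()
optimizer.step()
```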
Procedia PDF Downloads 130
6888 Towards End-To-End Disease Prediction from Raw Metagenomic Data
Authors: Maxence Queyrel, Edi Prifti, Alexandre Templier, Jean-Daniel Zucker
Abstract:
Analysis of the human microbiome using metagenomic sequencing data has demonstrated a high ability to discriminate various human diseases. Raw metagenomic sequencing data require multiple complex and computationally heavy bioinformatics steps prior to data analysis. Such data contain millions of short sequences read from the fragmented DNA sequences and stored as fastq files. Conventional processing pipelines consist of multiple steps, including quality control, filtering, and alignment of sequences against genomic catalogs (genes, species, taxonomic levels, functional pathways, etc.). These pipelines are complex to use and time-consuming, and they rely on a large number of parameters that often introduce variability and impact the estimation of the microbiome elements. Training deep neural networks directly from raw sequencing data is a promising approach to bypass some of the challenges associated with mainstream bioinformatics pipelines. Most of these methods use the concept of word and sentence embeddings, which create a meaningful numerical representation of DNA sequences while extracting features and reducing the dimensionality of the data. In this paper, we present metagenome2vec, an end-to-end approach that classifies patients into disease groups directly from raw metagenomic reads. This approach is composed of four steps: (i) generating a vocabulary of k-mers and learning their numerical embeddings; (ii) learning DNA sequence (read) embeddings; (iii) identifying the genome from which the sequence is most likely to come; and (iv) training a multiple instance learning classifier that predicts the phenotype based on the vector representation of the raw data. An attention mechanism is applied in the network so that the model can be interpreted, assigning a weight to each genome's influence on the prediction. Using two public real-life datasets as well as a simulated one, we demonstrate that this original approach reaches high performance, comparable with state-of-the-art methods applied directly to data processed through mainstream bioinformatics workflows. These results are encouraging for this proof-of-concept work. We believe that with further dedication, DNN models have the potential to surpass mainstream bioinformatics workflows in disease classification tasks.
Keywords: deep learning, disease prediction, end-to-end machine learning, metagenomics, multiple instance learning, precision medicine
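To make step (i) concrete, here is a minimal sketch of building a k-mer "vocabulary" from reads and learning k-mer embeddings with a word2vec-style model, plus a naive mean-pooled read embedding for step (ii). The k value, embedding size, and gensim-based training are illustrative assumptions, not the authors' exact implementation.

```python
import numpy as np
from gensim.models import Word2Vec

def kmerize(read: str, k: int = 6) -> list[str]:
    """Split a DNA read into overlapping k-mers ("words")."""
    return [read[i:i + k] for i in range(len(read) - k + 1)]

# Toy reads standing in for fastq sequences.
reads = ["ACGTACGTGGCA", "TTGACGTACGTA", "GGCATTGACGTA"]
corpus = [kmerize(r, k=6) for r in reads]  # each read is a "sentence"

# Learn numerical k-mer embeddings (assumed hyperparameters).
model = Word2Vec(corpus, vector_size=64, window=5, min_count=1, sg=1)

kmer_vec = model.wv["ACGTAC"]  # embedding of one k-mer
# A simple read embedding: mean of its k-mer vectors (step ii, naively).
read_vec = np.mean([model.wv[k] for k in corpus[0]], axis=0)
print(kmer_vec.shape, read_vec.shape)
```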
Procedia PDF Downloads 125
6887 Stationary Gas Turbines in Power Generation: Past, Present and Future Challenges
Authors: Michel Moliere
Abstract:
In the coming decades, the thermal power generation segment will survive only if it achieves deep mutations, including drastic abatement of CO2 emissions and strong efficiency gains. In this challenging perspective, stationary gas turbines appear as serious candidates to lead the energy transition. Indeed, during the past decades, these turbomachines have made brisk technological advances in terms of efficiency, reliability, fuel flexibility (including the combustion of hydrogen), and the ability to hybridize with renewables. It is therefore timely to summarize the progress achieved by gas turbines in the recent past and to examine their assets for facing the challenges of the energy transition.
Keywords: energy transition, gas turbines, decarbonization, power generation
Procedia PDF Downloads 208
6886 Contrast Enhancement of Color Images with Color Morphing Approach
Authors: Javed Khan, Aamir Saeed Malik, Nidal Kamel, Sarat Chandra Dass, Azura Mohd Affandi
Abstract:
Low contrast images can result from wrong image acquisition settings or poor illumination conditions. Such images may not be visually appealing and can be difficult for feature extraction. Contrast enhancement of color images can be useful in the medical field for visual inspection. In this paper, a new technique is proposed to improve the contrast of color images. The RGB (red, green, blue) color image is transformed into normalized RGB color space. Adaptive histogram equalization (AHE) is applied to each of the three channels of the normalized RGB color space. The corresponding channels in the original (low contrast) image and in the contrast-enhanced AHE image are morphed together in proper proportions. The proposed technique is tested on seventy color images of acne patients. The results are analyzed using cumulative variance and contrast improvement factor measures, and are also compared with decorrelation stretch. Both subjective and quantitative analysis demonstrate that the proposed technique outperforms the other techniques.
Keywords: contrast enhancement, normalized RGB, adaptive histogram equalization, cumulative variance
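A minimal sketch of the channel-wise scheme described above, using OpenCV's CLAHE as the adaptive histogram equalizer and a fixed blend weight for the "morphing" step; the 0.6/0.4 proportions, the CLAHE parameters, and the file names are assumptions, not the authors' tuned values.

```python
import cv2
import numpy as np

def enhance_contrast(bgr: np.ndarray, alpha: float = 0.6) -> np.ndarray:
    """Blend each normalized-RGB channel with its AHE-equalized version."""
    # Normalized RGB: each channel divided by the per-pixel channel sum.
    s = bgr.sum(axis=2, keepdims=True).astype(np.float32) + 1e-6
    norm = (bgr.astype(np.float32) / s * 255.0).astype(np.uint8)

    clahe = cv2.createCLAHE(clipLimit=2.0, tileGridSize=(8, 8))
    channels = []
    for c in cv2.split(norm):
        equalized = clahe.apply(c)  # adaptive histogram equalization
        # "Morph" original and equalized channels in proportion.
        channels.append(cv2.addWeighted(c, 1 - alpha, equalized, alpha, 0))
    return cv2.merge(channels)

image = cv2.imread("acne_patient.jpg")  # hypothetical input image
enhanced = enhance_contrast(image)
cv2.imwrite("enhanced.jpg", enhanced)
```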
Procedia PDF Downloads 378
6885 Improvement of Bearing Capacity of Soft Clay Using Geo-Cells
Authors: Siddhartha Paul, Aman Harlalka, Ashim K. Dey
Abstract:
Soft clayey soil possesses poor bearing capacity and high compressibility, because of which foundations cannot be placed directly over soft clay. Normally, pile foundations are constructed to carry the load through the soft soil to the hard stratum below, but pile construction is costly and time-consuming. In order to improve the properties of soft clay, many ground improvement techniques, such as stone columns and preloading with or without sand drains/band drains, are in vogue; time is a constraint for the successful application of these techniques. Another way to improve the bearing capacity of soft clay and to reduce settlement is to place geocells below the foundation. The geocells impart rigidity to the foundation soil, reduce the net load intensity on the soil, and thus reduce compressibility. A well-designed geocell-reinforced soil may replace a pile foundation. The present paper deals with the applicability of geocells to improving bearing capacity. It is observed that a properly designed geocell may increase the bearing capacity of soft clay up to two and a half times.
Keywords: bearing capacity, geo-cell, ground improvement, soft clay
Procedia PDF Downloads 322
6884 Management of H. Armigera by Using Various Techniques
Authors: Ajmal Khan Kassi, Humayun Javed, Syed Abdul Qadeem
Abstract:
The study was conducted to find the best management practices against American bollworm on the okra variety Arka Anamika during 2016. Four different management practices, viz. release of Trichogramma chilonis, hoeing and weeding, clipping, and the insect growth regulator (IGR) lufenuron, were tested individually and in all possible combinations for controlling American bollworm at three diverse sites, viz. the University Research Farm Koont, NARC, and a farmer's field at Taxila. All treatment combinations showed significant results regarding fruit damage. The minimum fruit infestation, i.e., 3.20% and 3.58%, was recorded with the combined treatment (T. chilonis + hoeing + weeding + lufenuron) in two different localities. This combined treatment also resulted in the maximum yield at NARC and Taxila, i.e., 57.67 and 62.66 q/ha, respectively, and gave the best results for managing H. armigera. On the basis of the different integrated pest management techniques, the Arka Anamika variety proved to be comparatively resistant against H. armigera in the different localities, so this variety is recommended for cultivation in the Pothwar region to obtain maximum yield.
Keywords: management, American bollworm, Arka Anamika, okra
Procedia PDF Downloads 55
6883 Carbon-Based Electrochemical Detection of Pharmaceuticals from Water
Authors: M. Ardelean, F. Manea, A. Pop, J. Schoonman
Abstract:
The presence of pharmaceuticals in the environment, and especially in water, has gained increasing attention. They belong to an emerging class of pollutants, and for most of them legal limits have not been set up, because their impact on human health and ecosystems has not been determined and/or advanced analytical methods for their quantification are lacking. In this context, the development of advanced analytical methods for the quantification of pharmaceuticals in water is required. Electrochemical methods are known to exhibit great potential for high-performance analysis, but their performance is directly related to the electrode material and the operating techniques. In this study, two types of carbon-based electrode materials, i.e., boron-doped diamond (BDD) and carbon nanofiber (CNF)-epoxy composite electrodes, have been investigated through voltammetric techniques for the detection of naproxen in water. The comparative electrochemical behavior of naproxen (NPX) on both BDD and CNF electrodes was studied by cyclic voltammetry, and a well-defined peak corresponding to NPX oxidation was found for each electrode. NPX oxidation occurred on the BDD electrode at a potential of about +1.4 V/SCE (saturated calomel electrode) and at about +1.2 V/SCE on the CNF electrode. The sensitivities for NPX detection were similar for both carbon-based electrodes, and thus the CNF electrode exhibited superiority in relation to the detection potential. Differential-pulsed voltammetry (DPV) and square-wave voltammetry (SWV) were exploited to improve the electroanalytical performance for NPX detection, and the best results, with a sensitivity of 9.959 µA·µM⁻¹, were achieved using DPV. In addition, the simultaneous detection of NPX and fluoxetine, a very common antidepressant drug also present in water, was studied using the CNF electrode, and very good results were obtained. The detection potential values, which allowed a good separation of the detection signals, together with the good sensitivities, were appropriate for the simultaneous detection of both tested pharmaceuticals. These results confirm the CNF electrode as a valuable tool for the individual/simultaneous detection of pharmaceuticals in water.
Keywords: boron-doped diamond electrode, carbon nanofiber-epoxy composite electrode, emerging pollutants, pharmaceuticals
Procedia PDF Downloads 281
6882 Segmented Pupil Phasing with Deep Learning
Authors: Dumont Maxime, Correia Carlos, Sauvage Jean-François, Schwartz Noah, Gray Morgan
Abstract:
Context: The concept of the segmented telescope is unavoidable for building extremely large telescopes (ELTs) in the quest for spatial resolution, but it also allows one to fit a large telescope within a reduced volume of space (JWST) or into an even smaller volume (standard CubeSat). CubeSats have tight constraints on the available computational budget and the allowed payload volume. At the same time, they undergo thermal gradients leading to large and evolving optical aberrations. Pupil segmentation nevertheless comes with an obvious difficulty: co-phasing the different segments. The CubeSat constraints prevent the use of a dedicated wavefront sensor (WFS), making the focal-plane images acquired by the science detector the most practical alternative. Yet one of the challenges of wavefront sensing is the non-linearity between the image intensity and the phase aberrations. Moreover, for Earth observation, the object is unknown and unrepeatable. Recently, several studies have suggested neural networks (NNs) for wavefront sensing, especially convolutional NNs, which are well known as non-linear, image-friendly problem solvers. Aims: We study in this paper the prospect of using NNs to measure the phasing aberrations of a segmented pupil directly from the focal-plane image, without dedicated wavefront sensing. Methods: In our application, we take the case of a deployable telescope fitting in a CubeSat for Earth observation, which triples the aperture size (compared to the 10 cm CubeSat standard) and therefore triples the angular resolution capacity. In order to reach the diffraction-limited regime at visible wavelengths, a wavefront error below lambda/50 is typically required. The telescope's focal-plane detector, used for imaging, serves as the wavefront sensor. In this work, we study a point source, i.e., the point spread function (PSF) of the optical system, as the input of a VGG-net neural network, an architecture designed for image regression/classification. Results: This approach shows promising results (about 2 nm RMS of residual WFE, i.e., below lambda/50, for 40-100 nm RMS of input WFE) with a relatively fast computation time of less than 30 ms, which translates into a small computational burden. These results motivate further study with higher aberrations and noise.
Keywords: wavefront sensing, deep learning, deployable telescope, space telescope
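As a rough sketch of the method, the snippet below adapts torchvision's VGG16 to regress segment phasing coefficients from a single-channel PSF image. The number of output coefficients (here piston/tip/tilt for an assumed six segments), the input size, and the synthetic batch are assumptions for illustration, not the authors' exact network.

```python
import torch
import torch.nn as nn
from torchvision import models

N_COEFFS = 18  # assumed: piston/tip/tilt for 6 segments

# VGG-style network repurposed for regression of phasing aberrations.
net = models.vgg16(weights=None)
net.features[0] = nn.Conv2d(1, 64, kernel_size=3, padding=1)  # PSF is 1-channel
net.classifier[-1] = nn.Linear(net.classifier[-1].in_features, N_COEFFS)

criterion = nn.MSELoss()
optimizer = torch.optim.Adam(net.parameters(), lr=1e-4)

# Dummy batch: focal-plane PSF images and their known segment aberrations.
psf = torch.randn(4, 1, 224, 224)
coeffs = torch.randn(4, N_COEFFS)  # nm RMS per mode (simulated ground truth)

loss = criterion(net(psf), coeffs)
loss.backward()
optimizer.step()
```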
Procedia PDF Downloads 104
6881 Estimating Pile Toe Levels for Capacity Assessment of Piers and Wharves in the Philippines
Authors: Ailvy Faith Zamora, Serj Donn David, Michael Anderson
Abstract:
There are a number of decades-old piers and wharves in Manila, Philippines, that are currently being used for container and bulk cargo handling port operations. These structures fulfill a very important role in the economy and hence have undergone rehabilitation and capacity assessments to accommodate current and future operational requirements. The capacity assessment includes structural and geotechnical pile evaluation. Unfortunately, old marine structures in the Philippines may not have a complete set of as-built information; in certain instances, critical information such as pile toe levels is missing from the documentation. A combination of direct tests, geophysical tests, and numerical analysis/modelling has been performed to estimate the existing pile toe levels of open-type piers and anchored quay-wall wharves in Manila. These techniques were applied to both concrete and steel piles. This paper presents the tools, testing setup, and techniques used for estimating the toe levels of existing piles for certain structures, including the challenges encountered and the solutions applied.
Keywords: geophysical testing, pile toe level, structural assessment, piers, wharves
Procedia PDF Downloads 129
6880 A Study of Using Different Printed Circuit Board Design Methods on Ethernet Signals
Authors: Bahattin Kanal, Nursel Akçam
Abstract:
Data transmission size and frequency requirements are increasing rapidly in electronic communication protocols. Increasing data transmission speeds have made the design of printed circuit boards much more important. It is important to carefully examine the requirements and perform analyses before and after the design of a digital electronic circuit board. The paper delves into impedance matching techniques, signal trace routing considerations, and the impact of layer stacking on signal performance. It extensively explores techniques for minimizing crosstalk and interference, presenting a holistic perspective on design strategies to optimize the quality of high-speed signals. Through a comprehensive review of these design methodologies, this study aims to provide insights into achieving reliable, high-performance printed circuit board layouts for these signals. In this study, the effect of different design methods on Ethernet signals was examined in terms of S-parameters. Siemens' HyperLynx software tool was used for the analyses.
Keywords: HyperLynx, printed circuit board, S-parameters, Ethernet
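For readers reproducing this kind of comparison outside HyperLynx, here is a minimal sketch using the open-source scikit-rf package to compare the insertion loss (S21) of two routing variants from exported S-parameter (Touchstone) files; the file names and port mapping are hypothetical.

```python
import skrf as rf
import matplotlib.pyplot as plt

# Hypothetical Touchstone exports for two PCB routing variants.
variant_a = rf.Network("ethernet_route_a.s2p")
variant_b = rf.Network("ethernet_route_b.s2p")

fig, ax = plt.subplots()
# S21 in dB is the insertion loss of the trace
# (assuming port 1 = driver, port 2 = receiver).
variant_a.plot_s_db(m=1, n=0, ax=ax, label="Variant A S21")
variant_b.plot_s_db(m=1, n=0, ax=ax, label="Variant B S21")
ax.set_title("Insertion loss comparison of two routing strategies")
plt.show()
```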
Procedia PDF Downloads 34
6879 Insights into Archaeological Human Sample Microbiome Using 16S rRNA Gene Sequencing
Authors: Alisa Kazarina, Guntis Gerhards, Elina Petersone-Gordina, Ilva Pole, Viktorija Igumnova, Janis Kimsis, Valentina Capligina, Renate Ranka
Abstract:
The human body is inhabited by a vast number of microorganisms, collectively known as the human microbiome, and there is tremendous interest in evolutionary changes in human microbial ecology, diversity, and function. The field of paleomicrobiology, the study of the ancient human microbiome, is powered by modern Next Generation Sequencing (NGS) techniques, which allow microbial genomic data to be extracted directly from an archaeological sample of interest. One of the major techniques is 16S rRNA gene sequencing, in which certain 16S rRNA gene hypervariable regions are amplified and sequenced. However, this method has limitations, including the taxonomic precision and efficacy of the different regions used. The aim of this study was to evaluate the phylogenetic sensitivity of different 16S rRNA gene hypervariable regions for microbiome studies in archaeological samples. Towards this aim, archaeological bone samples and corresponding soil samples from each burial environment were collected in medieval cemeteries in Latvia. The Ion 16S™ Metagenomics Kit, targeting different 16S rRNA gene hypervariable regions, was used for library construction (Ion Torrent technology). Sequencing data were analysed using appropriate bioinformatic techniques; alignment and taxonomic representation were done using the Mothur program. Sequences of the most abundant genera were further aligned to the E. coli 16S rRNA gene reference sequence using MEGA7 in order to identify the hypervariable region of the segment of interest. Our results showed that different hypervariable regions had different discriminatory power depending on the groups of microbes as well as the nature of the samples. On the basis of our results, we suggest that a wider range of primers can provide a more accurate recapitulation of microbial communities in archaeological samples. Acknowledgements: This work was supported by ERAF grant Nr. 1.1.1.1/16/A/101.
Keywords: 16S rRNA gene, ancient human microbiome, archaeology, bioinformatics, genomics, microbiome, molecular biology, next-generation sequencing
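The region-mapping step (aligning an abundant sequence to the E. coli 16S reference to see which hypervariable region it covers) can be sketched with Biopython as below; the toy sequences, scoring values, and the approximate V-region coordinate table are illustrative assumptions.

```python
from Bio import Align

# Hypothetical fragments: an E. coli 16S reference and one abundant read.
reference = "AGAGTTTGATCCTGGCTCAGATTGAACGCTGGCGGCAGGCCTAACACATGCAAGTCGAACG"
query = "TTGAACGCTGGCGGCAGGCC"

aligner = Align.PairwiseAligner()
aligner.mode = "local"          # local alignment to place the read
aligner.match_score = 2
aligner.mismatch_score = -1
aligner.open_gap_score = -2

alignment = aligner.align(reference, query)[0]
start = alignment.aligned[0][0][0]   # first aligned block on the reference
end = alignment.aligned[0][-1][1]    # last aligned block on the reference
print(f"Read maps to reference positions {start}-{end}")

# Assumed (approximate) E. coli coordinates for hypervariable regions.
regions = {"V1": (69, 99), "V2": (137, 242), "V3": (433, 497)}
hits = [name for name, (s, e) in regions.items() if start < e and end > s]
print("Overlapping hypervariable regions:", hits or "none in this fragment")
```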
Procedia PDF Downloads 190
6878 Secure Hashing Algorithm and Advance Encryption Algorithm in Cloud Computing
Authors: Jaimin Patel
Abstract:
Cloud computing is one of the most significant movements in computing technology. It provides flexibility to users, cost effectiveness, location independence, easy maintenance, multitenancy, drastic performance improvements, and increased productivity. On the other hand, there are also major issues, security foremost among them. Because a cloud is a shared server, security is a major concern: it is important to protect users' private data, especially in e-commerce and social networks. In this paper, encryption algorithms such as the Advanced Encryption Standard (AES), their vulnerabilities, risks of attack, optimal time and complexity management, and a comparison with other algorithms based on software implementation are presented. Encryption techniques to improve the performance of AES and to reduce risk are given. Secure Hash Algorithms, their vulnerabilities, software implementations, and risks of attack are likewise compared with other hashing algorithms, and the advantages and disadvantages of hashing versus encryption are given.
Keywords: cloud computing, encryption algorithm, secure hashing algorithm, brute force attack, birthday attack, plaintext attack, man-in-the-middle attack
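A minimal Python sketch of the two primitives the paper compares: SHA-256 for integrity hashing (one-way) and AES for confidentiality (two-way), here in GCM mode, which is an assumed choice for illustration, using the standard hashlib module and the third-party cryptography package.

```python
import hashlib
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

data = b"user's private record stored in the cloud"

# Hashing (one-way): a SHA-256 digest for integrity verification.
digest = hashlib.sha256(data).hexdigest()
print("SHA-256:", digest)

# Encryption (two-way): AES-256-GCM gives confidentiality + authenticity.
key = AESGCM.generate_key(bit_length=256)
nonce = os.urandom(12)  # must be unique per message under the same key
ciphertext = AESGCM(key).encrypt(nonce, data, None)

# Only a key holder can reverse encryption; a hash cannot be reversed.
plaintext = AESGCM(key).decrypt(nonce, ciphertext, None)
assert plaintext == data
assert hashlib.sha256(plaintext).hexdigest() == digest
```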
Procedia PDF Downloads 280
6877 Enhancing Sensitivity in Multifrequency Atomic Force Microscopy
Authors: Babak Eslami
Abstract:
Bimodal and trimodal AFM have added new capabilities to scanning probe microscopy characterization techniques. These capabilities have specifically enhanced the material characterization of surfaces and provided subsurface imaging in addition to conventional topography images. Bimodal and trimodal AFM, two multifrequency AFM techniques, are based on exciting the cantilever's fundamental eigenmode simultaneously with the second and third eigenmodes. Although higher eigenmodes provide a larger number of observables that can yield additional information about the sample, they cause experimental challenges. In this work, different experimental approaches to enhancing multifrequency AFM images for different characterization goals are provided. The trade-offs between eigenmodes, including the advantages and disadvantages of using each mode for different samples (ranging from stiff to soft matter) in both air and liquid environments, are discussed. Additionally, the advantage of performing conventional single tapping mode AFM with higher eigenmodes of the cantilever in order to reduce sample indentation is discussed. These analyses are performed on widely used polymers, such as polystyrene and polymethyl methacrylate, and on air nanobubbles on different surfaces, in both air and liquid.
Keywords: multifrequency, sensitivity, soft matter, polymer
Procedia PDF Downloads 134
6876 Prediction of Sepsis Illness from Patients Vital Signs Using Long Short-Term Memory Network and Dynamic Analysis
Authors: Marcio Freire Cruz, Naoaki Ono, Shigehiko Kanaya, Carlos Arthur Mattos Teixeira Cavalcante
Abstract:
The systems that record patient care information, known as Electronic Medical Records (EMRs), and those that monitor patients' vital signs, such as heart rate, body temperature, and blood pressure, have been extremely valuable for the effectiveness of patient treatment. Several studies have used data from EMRs and patients' vital signs to predict illnesses. Among them, we highlight those that intend to predict, classify, or at least identify patterns of sepsis in patients under vital-sign monitoring. Sepsis is an organ dysfunction caused by a dysregulated patient response to an infection, and it affects millions of people worldwide. Early detection of sepsis is expected to provide a significant improvement in its treatment. Preceding works usually combined medical, statistical, mathematical, and computational models to develop early-prediction methods, seeking higher accuracies with the smallest number of variables. Among other techniques, studies using survival analysis, expert systems, machine learning, and deep learning have reached great results. In our research, patients are modeled as points moving each hour in an n-dimensional space, where n is the number of vital signs (variables). These points can reach a sepsis target point after some time. For now, the sepsis target point is calculated using the median of all patients' variables at sepsis onset. From these points, we calculate for each hour the position vector, the first derivative (velocity vector), and the second derivative (acceleration vector) of the variables to evaluate their behavior, and we construct a prediction model based on a Long Short-Term Memory (LSTM) network, including these derivatives as explanatory variables. The accuracy of prediction 6 hours before the time of sepsis reached 83.24% when considering only the vital signs, and 94.96% when including the position, velocity, and acceleration vectors. The data are collected from the Medical Information Mart for Intensive Care (MIMIC) database, a public database that contains vital signs, laboratory test results, observations, notes, and so on, from more than 60,000 patients.
Keywords: dynamic analysis, long short-term memory, prediction, sepsis
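A compact sketch of the feature construction and model described above: hourly vital signs are augmented with first and second discrete derivatives (velocity and acceleration) and fed to an LSTM that outputs a sepsis logit 6 hours ahead. The network sizes, number of vitals, and synthetic data are assumptions for illustration.

```python
import numpy as np
import torch
import torch.nn as nn

# Synthetic stand-in for MIMIC data: 32 patients, 24 hourly steps, 6 vitals.
vitals = np.random.rand(32, 24, 6).astype(np.float32)

# Velocity and acceleration via discrete differences along the time axis.
velocity = np.diff(vitals, n=1, axis=1, prepend=vitals[:, :1])
acceleration = np.diff(velocity, n=1, axis=1, prepend=velocity[:, :1])
features = np.concatenate([vitals, velocity, acceleration], axis=2)  # 18 vars

class SepsisLSTM(nn.Module):
    def __init__(self, n_features: int = 18, hidden: int = 64):
        super().__init__()
        self.lstm = nn.LSTM(n_features, hidden, batch_first=True)
        self.head = nn.Linear(hidden, 1)

    def forward(self, x):
        out, _ = self.lstm(x)
        return self.head(out[:, -1])  # last hidden state -> sepsis logit

model = SepsisLSTM()
labels = torch.randint(0, 2, (32, 1)).float()  # sepsis within 6 h (dummy)
loss = nn.BCEWithLogitsLoss()(model(torch.from_numpy(features)), labels)
loss.backward()
```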
Procedia PDF Downloads 125
6875 Utility of Executive Function Training in Typically Developing Adolescents and Special Populations: A Systematic Review of the Literature
Authors: Emily C. Shepard, Caroline Sweeney, Jessica Grimm, Sophie Jacobs, Lauren Thompson, Lisa L. Weyandt
Abstract:
Adolescence is a critical phase of development in which individuals are prone to riskier behavior while also facing potentially life-changing decisions. The balance of increased behavioral risk and responsibility indicates the importance of executive functioning ability. In recent years, executive function training has emerged as a technique to enhance this cognitive ability. The aim of the present systematic review was to assess the reported efficacy of executive function training techniques among adolescents. After reviewing 3,110 articles, a total of 24 articles were identified that examined the role of executive function training techniques among adolescents (ages 10-19). The articles retrieved allowed comparison across psychiatric and medical diagnoses, location of training, and stage of adolescence. Typically developing samples, as well as those with attention-deficit hyperactivity disorder (ADHD), autism spectrum disorder (ASD), conduct disorder, and physical health concerns, were found, allowing for comparison of the efficacy of techniques in light of physical and psychological heterogeneity. Among typically developing adolescents, executive function training yielded nonsignificant or low-effect-size improvements in executive functioning, and in some cases executive functioning ability decreased following the training. In special populations, including those with ADHD, ASD, conduct disorder, and physical health concerns, significant differences and larger effect sizes in executive functioning were seen following treatment, particularly among individuals with ADHD. Future research is needed to identify the long-term efficacy of these treatments, as well as their generalizability to real-world conditions.
Keywords: adolescence, attention-deficit hyperactivity disorder, executive function, executive function training, traumatic brain injury
Procedia PDF Downloads 190
6874 Sustainable User Comfort Using Building Envelope Design; From Traditional Methods to Innovative Solutions
Authors: Soufi Saylam
Abstract:
Environmental concerns, rising energy consumption, and the high cost of mechanical systems have all contributed to increased interest in building energy efficiency and passive thermal design in recent years. This study evaluates building envelope components and associated retrofits in terms of their impact on energy efficiency and occupant comfort in a sustainable context. The design of the building envelope, as a critical component of the building, has a significant impact on the organization of interior space and user comfort. In this regard, in order to achieve maximum comfort and energy savings, the design of the building envelope should include a thermal comfort system that adapts to climatic variables. This system should be developed in harmony with the environmental features, building shape, and materials used. The aim of this study is to investigate the role of the building envelope in sustainable architecture by integrating traditional envelope design principles and strategies with technological techniques, and to examine its role in providing physical and psychological comfort to users in interior space.
Keywords: envelope design, functional needs, physiological comfort, sustainable architecture, traditional techniques
Procedia PDF Downloads 6
6873 Two-Stage Estimation of Tropical Cyclone Intensity Based on Fusion of Coarse and Fine-Grained Features from Satellite Microwave Data
Authors: Huinan Zhang, Wenjie Jiang
Abstract:
Accurate estimation of tropical cyclone intensity is of great importance for disaster prevention and mitigation. Existing techniques are largely based on satellite imagery data, and the research and utilization of the inner thermal core structure characteristics of tropical cyclones still pose challenges. This paper presents a two-stage tropical cyclone intensity estimation network based on the fusion of coarse- and fine-grained features from microwave brightness temperature data. The data used in this network are obtained from the thermal core structure of tropical cyclones through Advanced Technology Microwave Sounder (ATMS) inversion. First, the thermal core information in the pressure direction is comprehensively expressed through the maximal intensity projection (MIP) method, constructing coarse-grained thermal core images that represent the tropical cyclone. These images provide a coarse-grained-feature wind speed estimate in the first stage. Then, based on this result, fine-grained features are extracted by combining thermal core information from multiple view profiles with a distributed network and fused with the coarse-grained features from the first stage to obtain the final two-stage wind speed estimate. Furthermore, to better capture the long-tailed distribution of tropical cyclone intensities, focal loss is used in the coarse-grained loss function of the first stage, and ordinal regression loss is adopted in the second stage in place of traditional single-value regression. The selected tropical cyclones span 2012 to 2021 in the North Atlantic (NA) region; the training set covers 2012 to 2017, the validation set 2018 to 2019, and the test set 2020 to 2021. Based on the Saffir-Simpson Hurricane Wind Scale (SSHS), this paper categorizes tropical cyclones into three major categories: pre-hurricane, minor hurricane, and major hurricane, achieving a classification accuracy of 86.18% and an intensity estimation error of 4.01 m/s for the NA region. The results indicate that thermal core data can effectively represent the category and intensity of tropical cyclones, warranting further exploration of tropical cyclone attributes with these data.
Keywords: artificial intelligence, deep learning, data mining, remote sensing
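The maximal intensity projection step can be illustrated in a few lines of NumPy: for each brightness temperature volume, the maximum along the pressure axis collapses the 3-D thermal core into a 2-D coarse-grained image. The array shape and values are assumptions for illustration, not ATMS retrieval output.

```python
import numpy as np

# Assumed brightness temperature volume: (pressure levels, lat, lon) in K.
bt_volume = 250.0 + 30.0 * np.random.rand(22, 64, 64)

# Maximal intensity projection along the pressure direction collapses
# the thermal core into one coarse-grained 2-D image per cyclone sample.
mip_image = bt_volume.max(axis=0)  # shape (64, 64)

# A simple normalization before feeding the first-stage network.
coarse_input = (mip_image - mip_image.mean()) / (mip_image.std() + 1e-8)
print(coarse_input.shape, coarse_input.dtype)
```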
Procedia PDF Downloads 63
6872 Evaluation of Three Digital Graphical Methods of Baseflow Separation Techniques in the Tekeze Water Basin in Ethiopia
Authors: Alebachew Halefom, Navsal Kumar, Arunava Poddar
Abstract:
The purpose of this work is to specify parameter values and the baseflow index (BFI) and to rank the methods that should be used for baseflow separation. Three different digital graphical approaches are chosen and used in this study for comparison. Daily time series discharge data were collected from the site for a period of 30 years (1986 to 2015) and used to evaluate the algorithms. In order to separate the baseflow from the surface runoff, the daily recorded streamflow (m³/s) data were used to calibrate the procedures and obtain parameter values for the basin. Additionally, the performance of the models was assessed using the standard error (SE), the coefficient of determination (R²), the flow duration curve (FDC), and baseflow indexes. The findings indicate that, in general, each strategy can be used worldwide to separate baseflow; however, the sliding interval method (SIM) performs significantly better than the other two techniques in this basin. The average baseflow index was calculated to be 0.72 using the local minimum method, 0.76 using the fixed interval method, and 0.78 using the sliding interval method.
Keywords: baseflow index, digital graphical methods, streamflow, Emba Madre Watershed
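A minimal sketch of the sliding interval idea and the BFI computation: a window slides over the daily hydrograph, and the baseflow at each day is taken as the minimum discharge within the window, capped by the observed flow. The window width and the synthetic record are assumed parameters, not the calibrated values from this basin.

```python
import numpy as np

def sliding_interval_baseflow(q: np.ndarray, n_star: int = 5) -> np.ndarray:
    """Baseflow by a sliding interval: at each day, take the minimum
    discharge within a window of 2*n_star + 1 days (assumed width)."""
    base = np.empty_like(q, dtype=float)
    for i in range(len(q)):
        lo, hi = max(0, i - n_star), min(len(q), i + n_star + 1)
        base[i] = q[lo:hi].min()
    return np.minimum(base, q)  # baseflow never exceeds total streamflow

# Synthetic daily streamflow (m^3/s) standing in for the 30-year record.
rng = np.random.default_rng(0)
flow = 5.0 + 3.0 * np.abs(np.sin(np.arange(365) / 20)) + rng.random(365)

baseflow = sliding_interval_baseflow(flow)
bfi = baseflow.sum() / flow.sum()  # baseflow index
print(f"BFI = {bfi:.2f}")
```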
Procedia PDF Downloads 79
6871 Impact of Different Modulation Techniques on the Performance of Free-Space Optics
Authors: Naman Singla, Ajay Pal Singh Chauhan
Abstract:
As the demand for high bit rates and high bandwidth increases rapidly, there is a need to examine this problem and find a technology that provides both. One possible solution is the use of optical fiber. Optical fiber technology provides bandwidth in the THz range, but it has the disadvantages of high cost and limited reach, since it is not possible to lay fiber to every location on earth, and it requires costly maintenance. Another technology, very similar to optical fiber, is Free Space Optics (FSO). FSO is a line-of-sight technology in which a modulated optical beam, whether infrared or visible, is used to transfer information from one point to another through the atmosphere, which acts as the channel. This paper concentrates on analyzing the performance of FSO in terms of bit error rate (BER) and quality factor (Q) using different modulation techniques, namely non-return-to-zero on-off keying (NRZ-OOK), differential phase shift keying (DPSK), and differential quadrature phase shift keying (DQPSK), in OptiSystem software. The findings of this paper show that the FSO system based on the DQPSK modulation technique performs best.
Keywords: attenuation, bit rate, free space optics, link length
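The Q factor and BER reported by such simulations are linked by the standard Gaussian-noise relation BER ≈ 0.5·erfc(Q/√2); a quick sketch for converting between the two (the Q values are illustrative, not the paper's results):

```python
import math
from scipy.special import erfc

def q_to_ber(q: float) -> float:
    """BER for a given Q factor under the Gaussian noise approximation."""
    return 0.5 * float(erfc(q / math.sqrt(2.0)))

# Illustrative Q factors, e.g., read from a simulator's BER analyzer.
for q in (3.0, 6.0, 7.0):
    print(f"Q = {q:.1f}  ->  BER = {q_to_ber(q):.3e}")
```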
Procedia PDF Downloads 347
6870 Multibody Constrained Dynamics of Y-Method Installation System for a Large Scale Subsea Equipment
Authors: Naeem Ullah, Menglan Duan, Mac Darlington Uche Onuoha
Abstract:
The lowering of subsea equipment into deep waters is a challenging job due to the harsh offshore environment. Many researchers have introduced various installation systems to deploy payloads safely into the deep ocean. In general practice, dual floating vessels are not employed owing to the prevalent safety risks and hazards caused by the ever-increasing dynamic effects of mutual interaction between the bodies. However, on practical grounds such as economy, the Y-method, in which two conventional tugboats support the equipment by two independent strands connected to a tri-plate above the equipment, has been employed to study the multibody dynamics of dual-barge lifting operations. In this study, the two tugboats and the suspended payload (the Y-method) are modeled as a multibody dynamic system for the lowering of subsea equipment into deep waters. Two wire ropes are used for the lifting and installation operation in this Y-method installation system. Six degrees of freedom (dof) for each body are considered, establishing a coupled 18-dof multibody model via the embedding technique, or velocity transformation technique. The fundamental advantage of this technique is that the constraint forces are eliminated directly, with no extra computational effort required for their elimination. The inertial frame of reference is taken at the water surface as the time-independent frame, and floating frames of reference are introduced in each body as time-dependent frames in order to formulate the velocity transformation matrix. The local transformation of the generalized coordinates to the inertial frame is executed by the Euler angle approach. Spherical joints articulate the bodies as the kinematic joints. The hydrodynamic force, the two strand forces, the hydrostatic force, and the mooring forces are taken into account as external forces. The radiation part of the hydrodynamic force is obtained from the Cummins equation, while the wave-exciting part is obtained using force response amplitude operators (RAOs) computed with the commercial solver OpenFOAM. The strand force is obtained by treating the wire rope as an elastic spring. The nonlinear hydrostatic force is obtained by pressure integration at each time step of the wave movement, and the mooring forces are evaluated using Faltinsen's analytical approach. The fourth-order Runge-Kutta method is employed to integrate the coupled equations of motion of the 18-dof multibody model. The results are correlated with a simulated OrcaFlex model; moreover, the OrcaFlex results are compared with the MOSES model from previous studies. The multibody dynamics of the single-barge lifting operation from former studies are compared with those of the established dual-barge lifting operation and are found to be larger in magnitude for the dual-barge case. It is noticed that the traction at the top connection point of the cable decreases with increasing length and becomes almost constant after passing through the splash zone.
Keywords: dual barge lifting operation, Y-method, multibody dynamics, installation of subsea equipment, shipbuilding
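As an illustration of the time integration used, here is a generic fourth-order Runge-Kutta stepper applied to a toy second-order system (a damped oscillator standing in for one dof of the coupled 18-dof equations of motion); the actual right-hand side assembling the hydrodynamic, strand, hydrostatic, and mooring forces is not reproduced, and the coefficients are assumed.

```python
import numpy as np

def rk4_step(f, t, y, h):
    """One fourth-order Runge-Kutta step for y' = f(t, y)."""
    k1 = f(t, y)
    k2 = f(t + h / 2, y + h / 2 * k1)
    k3 = f(t + h / 2, y + h / 2 * k2)
    k4 = f(t + h, y + h * k3)
    return y + h / 6 * (k1 + 2 * k2 + 2 * k3 + k4)

# Toy stand-in for one dof: x'' + c*x' + k*x = F(t), as a first-order system.
c, k = 0.3, 2.0
def rhs(t, y):
    x, v = y
    force = np.sin(0.5 * t)  # stand-in for wave excitation
    return np.array([v, force - c * v - k * x])

t, h = 0.0, 0.01
y = np.array([0.0, 0.0])  # initial displacement and velocity
for _ in range(1000):     # 10 s of simulated motion
    y = rk4_step(rhs, t, y, h)
    t += h
print(f"x(10 s) = {y[0]:.4f} m")
```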
Procedia PDF Downloads 203
6869 Delineation of the Geoelectric and Geovelocity Parameters in the Basement Complex of Northwestern Nigeria
Authors: M. D. Dogara, G. C. Afuwai, O. O. Esther, A. M. Dawai
Abstract:
The geology of northern Nigeria is under intense investigation, particularly that of the northwest, believed to be of the basement complex, where the lithology is highly variable; hence the need for a close-range study. In view of the above, two geophysical techniques, vertical electrical sounding employing the Schlumberger array and the seismic refraction method, were used to delineate the geoelectric and geovelocity parameters of the basement complex of northwestern Nigeria. A total area of 400,000 m² was covered, with sixty geoelectric stations established and sixty sets of seismic refraction data collected using the forward and reverse method. The interpretation of the resistivity data suggests that the area is underlain by not more than five geoelectric layers of varying thicknesses and resistivities when a maximum half-electrode spread of 100 m is used. The interpreted seismic data revealed two geovelocity layers, with velocities ranging from 478 m/s to 1666 m/s for the first layer and from 1166 m/s to 7141 m/s for the second layer. The results of the two techniques suggest that the study area has an undulating bedrock topography, with geoelectric and geovelocity layers composed of weathered rock materials.
Keywords: basement complex, delineation, geoelectric, geovelocity, Nigeria
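For reference, apparent resistivity in a Schlumberger sounding follows ρa = K·ΔV/I with the geometric factor K = π[(AB/2)² − (MN/2)²]/MN; a small sketch with assumed electrode spacings and instrument readings (not the survey's actual data):

```python
import math

def schlumberger_rho_a(ab2: float, mn: float, dv: float, i: float) -> float:
    """Apparent resistivity (ohm-m) for a Schlumberger array:
    ab2 = AB/2 current-electrode half-spacing (m), mn = MN spacing (m),
    dv = measured potential difference (V), i = injected current (A)."""
    k = math.pi * (ab2**2 - (mn / 2) ** 2) / mn  # geometric factor
    return k * dv / i

# Assumed field readings at one station for increasing AB/2.
for ab2, dv in [(10, 0.120), (32, 0.018), (100, 0.002)]:
    rho = schlumberger_rho_a(ab2=ab2, mn=2.0, dv=dv, i=0.5)
    print(f"AB/2 = {ab2:3d} m  ->  rho_a = {rho:8.1f} ohm-m")
```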
Procedia PDF Downloads 151
6868 Study of Objectivity, Reliability and Validity of Pedagogical Diagnostic Parameters Introduced in the Framework of a Specific Research
Authors: Emiliya Tsankova, Genoveva Zlateva, Violeta Kostadinova
Abstract:
The challenges modern education faces undoubtedly require reforms and innovations aimed at the reconceptualization of existing educational strategies and the introduction of new concepts and novel techniques and technologies, related to recasting the aims of education and remodeling its content and methodology, that would guarantee the alignment of our education with basic European values. Aim: The aim of the current research is the development of a didactic technology for assessing the applicability and efficacy of game techniques in pedagogic practice, calibrated to specific content and the age specificity of learners, as well as for evaluating the efficacy of such approaches in facilitating the acquisition of biological knowledge at a higher theoretical level. Results: In this research, we examine the objectivity, reliability, and validity of two newly introduced diagnostic parameters for assessing the durability of acquired knowledge. A pedagogic experiment was carried out to verify the hypothesis that the introduction of game techniques in biology education leads to an increase in the quantity, quality, and durability of the knowledge acquired by students. To monitor the effect of the game-based pedagogical technique on the durability of acquired knowledge, a test-based examination on the same content was administered to students from a control group (CG) and an experimental group (EG) after a six-month period. The analysis is based on: (1) a study of the statistical significance of the differences between the tests of the CG and the EG applied after the six-month period, which, however, is not indicative of the presence or absence of a marked effect of the applied pedagogic technique when the entry levels of the two groups differ; and (2) for a more reliable comparison, independent of the entry level of each group, an "indicator of the efficacy of game techniques for the durability of knowledge", used to assess the achievement results and durability of this methodology of education. The monitoring of the studied parameters in their dynamic unfolding in different age groups of learners unquestionably reveals a positive effect of the introduction of game techniques on the durability and permanence of acquired knowledge. Methods: The current research employed the following battery of methods and techniques for diagnostics: theoretical analysis and synthesis, an actual pedagogical experiment, a questionnaire, didactic testing, and mathematical and statistical methods. The data obtained were used for the qualitative and quantitative analysis of the results, which reflect the efficacy of the applied methodology. Conclusion: The didactic model of the parameters researched in the framework of this study of pedagogic diagnostics is based on a general, interdisciplinary approach. The enhanced durability of the acquired knowledge proves the transition of that knowledge from pupils' and students' short-term memory into long-term memory, which justifies the conclusion that didactic games have beneficial effects on learners' cognitive skills. The innovations in teaching enhance motivation, creativity, and independent cognitive activity in the process of acquiring the material taught. The innovative methods also allow for untraditional means of assessing the level of knowledge acquisition. This makes possible the timely discovery of knowledge gaps and the introduction of compensatory techniques, which in turn leads to deeper and more durable acquisition of knowledge.
Keywords: objectivity, reliability and validity of pedagogical diagnostic parameters introduced in the framework of a specific research
Procedia PDF Downloads 393
6867 The Effect of Addition of Dioctyl Terephthalate and Calcite on the Tensile Properties of Organoclay/Linear Low Density Polyethylene Nanocomposites
Authors: A. Gürses, Z. Eroğlu, E. Şahin, K. Güneş, Ç. Doğar
Abstract:
In recent years, polymer/clay nanocomposites have generated great interest in the polymer industry as a new type of composite material because of their superior properties, which include high heat deflection temperature, gas barrier performance, dimensional stability, enhanced mechanical properties, optical clarity, and flame retardancy when compared with the pure polymer or conventional composites. This study investigates the change in the tensile properties of organoclay/linear low density polyethylene (LLDPE) nanocomposites with the use of dioctyl terephthalate (DOTP) as a plasticizer and calcite as a filler. The synthesized composites and organoclay were characterized using techniques such as XRD, HRTEM, and FTIR. The spectroscopic results indicate that the organoclay platelets were well dispersed within the polymeric matrix. The tensile properties of the composites were compared by means of the stress-strain curves drawn for each composite and for the pure polymer. It was observed that the composites prepared by adding the plasticizer at different ratios along with a certain amount of calcite exhibited different tensile behaviors compared to the pure polymer.
Keywords: linear low density polyethylene, nanocomposite, organoclay, plasticizer
Procedia PDF Downloads 293
6866 Optimization of Acid Treatments by Assessing Diversion Strategies in Carbonate and Sandstone Formations
Authors: Ragi Poyyara, Vijaya Patnana, Mohammed Alam
Abstract:
When acid is pumped into damaged reservoirs for damage removal/stimulation, distorted inflow of acid into the formation occurs because acid preferentially travels into highly permeable regions over low-permeability regions, or, in general, along the path of least resistance. This can lead to poor zonal coverage and hence warrants diversion to achieve effective acid placement. Diversion is, desirably, a reversible technique for temporarily reducing the permeability of high-permeability zones, thereby forcing the acid into lower-permeability zones. The uniqueness of each reservoir can pose several challenges to engineers attempting to devise optimum and effective diversion strategies. Diversion techniques include mechanical placement and/or chemical diversion of treatment fluids, further sub-classified into ball sealers, bridge plugs, packers, particulate diverters, viscous gels, crosslinked gels, relative permeability modifiers (RPMs), and foams, and/or the use of placement techniques such as coiled tubing (CT) and the maximum pressure difference and injection rate (MAPDIR) methodology. It is not always realized that the effectiveness of diverters greatly depends on reservoir properties, such as formation type, temperature, reservoir permeability, heterogeneity, and physical well characteristics (e.g., completion type, well deviation, length of treatment interval, multiple intervals, etc.). This paper reviews the mechanisms by which each variety of diverter functions and discusses the effect of various reservoir properties on the efficiency of diversion techniques. Guidelines are recommended to help enhance productivity from zones of interest by choosing the best diversion methods while pumping an optimized amount of treatment fluid. The success of an overall acid treatment often depends on the effectiveness of the diverting agents.
Keywords: diversion, reservoir, zonal coverage, carbonate, sandstone
Procedia PDF Downloads 432
6865 Synthesis, Characterization, Antioxidant and Anti-inflammatory Studies of Modern Synthetic Tetra Phenyl Porphyrin Derivatives
Authors: Mian Gul Sayed, Rahim Shah, Fazal Mabood, Najeeb Ur Rahman, Maher Noor
Abstract:
Embarking on the frontier of molecular advancement, this study focuses on the synthesis and characterization of a distinct class of porphyrin derivatives, specifically the 5,10,15,20-tetrakis(3-bromopropoxyphenyl) porphyrins. Through meticulous synthetic methodologies, these derivatives are crafted by strategically incorporating bromopropoxyphenyl moieties at distinct positions within the porphyrin framework. This research aims to unravel the structural intricacies and explore the potential applications of these compounds through detailed characterization utilizing advanced analytical techniques. 5,10,15,20-Tetrakis(4-hydroxyphenyl)porphyrin was synthesized by treating pyrrole with p-hydroxybenzaldehyde and was then converted into 5,10,15,20-tetrakis(4-bromoalkoxyphenyl)porphyrin. The latter was treated with isopropylphenol, para-aminophenol, hydroquinone, 2-naphthol, and 1-naphthol, and different ether-linked derivatives were obtained. The synthesized compounds were analyzed using contemporary spectroscopic techniques such as UV-Vis, NMR, and mass spectrometry, and were also tested for their biological activities, namely antioxidant and anti-inflammatory activity.
Keywords: tetraphenyl porphyrin, NMR, antioxidant, anti-inflammatory
Procedia PDF Downloads 16
6864 Discussion of Blackness in Wrestling
Authors: Jason Michael Crozier
Abstract:
The wrestling territories of the mid-twentieth century in the United States are widely considered the birthplace of modern professional wrestling and, by many professional wrestlers, a beacon of hope for the easing of racial tensions during the civil rights era and beyond. The performers writing on this period speak of racial equality but fail to acknowledge the exploitation of black athletes as a racialized capital commodity, athletes who suffered the challenges of systemic racism codified by a false narrative of aspirational exceptionalism and of equality measured by audience diversity. The promoters' ability to equate racial and capital exploitation with equality leads to a broader discussion of the history of Muscular Christianity in the United States and the exploitation of black bodies. Narratives of racial erasure dominate the historical discourse on athleticism and exceptionalism, redefining how blackness existed and how physicality and race are conceived of in sport and entertainment spaces. When discussing the implications of race in professional wrestling, it is important to examine the role of promotions as 'imagined communities' where the social agency of wrestlers is defined and quantified based on their 'desired elements' as performers. The intentionally vague nature of this language masks a deep history of racialization that has been perpetuated by promoters and never fully examined by scholars. Sympathetic racism and the omission of cultural identity are also key factors in the limitations and racial barriers placed upon black athletes in the squared circle. The use of sympathetic racism within professional wrestling during the twentieth century sorted black athletes into two distinct categorizations: the 'black savage' or the 'black minstrel'. Black wrestlers of the twentieth century were defined by their strength as a capital commodity and by their physicality, rather than by their knowledge of the business and in-ring skill. These performers had little agency in shaping their own characters inside and outside the ring. Promoters would often create personas that heavily racialized the performer by tying them to a regional past or memory, such as that of slavery in the Deep South, using dog collar matches and adorning black characters in chains. Promoters softened cultural memory by satirizing the historical legacy of slavery and black identity.
Keywords: sympathetic racism, social agency, racial commodification, stereotyping
Procedia PDF Downloads 135
6863 Multi Biometric Personal Identification System Based on Hybrid Intelligence Method
Authors: Laheeb M. Ibrahim, Ibrahim A. Salih
Abstract:
Biometrics is a technology that has been widely used in many official and commercial identification applications. Increased security concerns during recent years (especially during the last decade) have resulted in more attention being given to biometric-based verification techniques. Here, a novel fusion approach combining palmprint and dental traits is suggested. These authentication traits have been employed in a range of biometric applications and can identify a person both postmortem (PM) and antemortem (AM). Besides improving accuracy, the fusion of biometrics has several advantages, such as deterring spoofing activities and reducing enrolment failure. In this paper, a unimodal biometric system was first built for each of the two traits (palmprint and dental), applying for classification an artificial neural network and a hybrid technique that combines swarm intelligence and a neural network; an attempt was then made to combine the palmprint and dental biometrics. Principally, the fusion of palmprint and dental biometrics and its potential application have been explored as biometric identifiers. To address this issue, investigations were carried out into the relative performance of several statistical data fusion techniques for integrating information in both unimodal and multimodal biometrics, and the results of the multimodal approach were compared with each of the two single-trait authentication approaches. This paper studies feature-level and decision-level fusion in multimodal biometrics. To determine the accuracy, the genuine acceptance rate (GAR) of parallel decision-level fusion (AND, OR, majority voting) was used. The backpropagation method used for classification yielded GARs of (92%, 99%, 97%), respectively, while the hybrid classification technique yielded (95%, 99%, 98%), respectively. For feature-level fusion, with the same classification methods, the results were (98%, 99%), respectively, and determining the GAR at the feature level with different methods yielded 98%.
Keywords: back propagation neural network (BP ANN), multibiometric system, parallel system decision fusion, particle swarm optimization (PSO)
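A minimal sketch of the parallel decision-level fusion rules named above (AND, OR, majority voting) for combining per-modality accept/reject decisions; the three-vote example is illustrative, not the paper's two-trait configuration.

```python
def fuse_decisions(decisions: list[bool], rule: str = "majority") -> bool:
    """Parallel decision-level fusion of per-modality accept (True) votes."""
    if rule == "and":
        return all(decisions)   # strictest: every matcher must accept
    if rule == "or":
        return any(decisions)   # most permissive: one accept suffices
    if rule == "majority":
        return sum(decisions) > len(decisions) / 2
    raise ValueError(f"unknown rule: {rule}")

# Example: palmprint accepts, dental rejects, a third matcher accepts.
votes = [True, False, True]
for rule in ("and", "or", "majority"):
    verdict = "accept" if fuse_decisions(votes, rule) else "reject"
    print(f"{rule:8s} -> {verdict}")
```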
Procedia PDF Downloads 533
6862 An Advanced Approach to Detect and Enumerate Soil-Transmitted Helminth Ova from Wastewater
Authors: Vivek B. Ravindran, Aravind Surapaneni, Rebecca Traub, Sarvesh K. Soni, Andrew S. Ball
Abstract:
Parasitic diseases have a devastating, long-term impact on human health and welfare. More than two billion people are infected with soil-transmitted helminths (STHs), including the roundworms (Ascaris), hookworms (Necator and Ancylostoma), and whipworm (Trichuris), with the majority of infections occurring in the tropical and subtropical regions of the world. Despite their low prevalence in developed countries, the removal of STHs from wastewater remains crucial to allow the safe use of sludge or recycled water in agriculture. Conventional methods such as incubation and optical microscopy are cumbersome; consequently, results vary drastically from person to person when observing the ova (eggs) under a microscope. Although PCR-based methods are an alternative to conventional techniques, they lack the ability to distinguish between viable and non-viable helminth ova. As a result, wastewater treatment industries are in major need of radically new and innovative tools to detect and quantify STH eggs accurately, precisely, and cost-effectively. Our study focuses on the following novel and innovative techniques:
- Recombinase polymerase amplification and surface-enhanced Raman spectroscopy (RPA-SERS) based detection of helminth ova.
- Use of metal nanoparticles and their relative nanozyme activity.
- Colorimetric detection, differentiation, and enumeration of helminth ova genera using hydrolytic enzymes (chitinase and lipase).
- Propidium monoazide (PMA)-qPCR to detect viable helminth ova.
- A modified assay to recover and enumerate helminth eggs from fresh raw sewage.
- Transcriptome analysis of Ascaris ova in fresh raw sewage.
The aforementioned techniques have the potential to replace current conventional and molecular methods, thereby producing a standard protocol for the determination and enumeration of helminth ova in sewage sludge.
Keywords: colorimetry, helminth, PMA-qPCR, nanoparticles, RPA, viable
Procedia PDF Downloads 299
6861 A Survey on Intelligent Connected-Vehicle Applications Based on Intercommunication Techniques in Smart Cities
Authors: B. Karabuluter, O. Karaduman
Abstract:
Connected vehicles are intelligent vehicles that can communicate with each other. Smart cities are the most prominent application area of such intercommunicating intelligent vehicles. The most important goal to be realized in smart cities, which are planned to make people's lives easier, is to make transportation more comfortable and safe with intelligent/autonomous/driverless vehicles communicating with each other. To ensure this, the city must first have a communication infrastructure, and the vehicles must have the features to communicate with this infrastructure and with each other. In this context, intelligent transport studies aimed at solving the transportation and traffic problems of classical cities continue to increase rapidly. In this study, current connected-vehicle applications developed for smart cities are surveyed in terms of communication techniques, vehicular networking, IoT, urban transportation implementations, intelligent traffic management, road safety, and self-driving. The taxonomies and assessments performed in this work show the trend of studies on inter-vehicle communication systems in smart cities and contribute by revealing the requirements in this area.
Keywords: smart city, connected vehicles, infrastructures, VANET, wireless communication, intelligent traffic management
Procedia PDF Downloads 526
6860 Savi Scout versus Wire-Guided Localization in Non-palpable Breast Lesions – Comparison of Breast Tissue Volume and Weight and Excision Safety Margin
Authors: Walid Ibrahim, Abdul Kasem, Sudeendra Doddi, Ilaria Giono, Tareq Sabagh, Muhammad Ammar, Nermin Osman
Abstract:
Background: Wire-guided localization (WL) is the most widely used method for the localization of non-palpable breast lesions. SAVI SCOUT occult lesion localization (SSL) is a new technique in breast-conserving surgery. SSL has the potential benefit of improving radiology workflow as well as localization accuracy. Purpose: The purpose of this study is to compare breast tissue specimen volume and weight and excision margins between WL and SSL. Materials and methods: A single-institution retrospective analysis of 377 female patients who underwent wide local breast excision with the SAVI SCOUT and/or wire-guided technique between 2018 and 2021 in the breast department of a UK university teaching hospital. Breast tissue specimen volume and weight, and excision margins, were evaluated across the three localization groups. Results: Three hundred and seventy-seven patients were studied; of these, 261 had wire localization, 88 had SCOUT, and 28 had the dual localization technique. Tumor size ranged from 1 to 75 mm (median 20 mm). The pathology specimen weight ranged from 1 to 466 g (median 46.8) and the volume from 1.305 to 1560 cm³ (median 106.32 cm³). SCOUT localization was associated with a significantly lower specimen weight than wire or dual-technique localization (median 41 g vs 47.3 g and 47 g, p = 0.029). SCOUT was not associated with a better specimen volume, with borderline significance in comparison to the wire and combined techniques (median 108 cm³ vs 105 cm³ and 105 cm³, p = 0.047). There was a significant correlation between tumor size and pathology specimen weight in the three groups. SCOUT showed a better >2 mm safety margin in comparison to the other two techniques (p = 0.031). Conclusion: Preoperative SCOUT localization is associated with lower specimen weight and a better specimen margin. SCOUT did not show any benefit in terms of specimen volume, which may be due to the difficulty of measuring the exact volume of an irregular soft tissue specimen.
Keywords: scout, wire, localization, breast
Procedia PDF Downloads 110