Search results for: genetic algorithm
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 4715

965 Evaluation of Radio Protective Potential of Indian Bamboo Leaves

Authors: Mansi Patel, Priti Mehta

Abstract:

Background: Ionizing radiation has detrimental effects on humans, and growing technological encroachment has enormously increased human exposure to it. These safety concerns have prompted researchers to develop radioprotectors from natural resources with minimal toxicity. A substance with anti-inflammatory, antioxidant, and immunomodulatory activity can be a potential candidate for radioprotection. One such plant with immense potential, bamboo, was selected for the present study. Purpose: The study aims to evaluate the potential of Indian bamboo leaves for protection against the clastogenic effect of gamma radiation. Methods: The protective effect of bamboo leaf extract against gamma radiation-induced genetic damage in human peripheral blood lymphocytes (HPBLs) was evaluated in vitro using the cytokinesis-blocked micronucleus (CBMN) assay. Blood samples were pretreated with varying concentrations of extract 30 min before radiation exposure (4 Gy and 6 Gy). The reduction in the frequency of micronuclei was observed for the irradiated and control groups. The effect of various concentrations of bamboo leaf extract (400, 600, and 800 mg/kg) on the development of radiation-induced sickness and altered mortality in mice exposed to 8 Gy of whole-body gamma radiation was studied. The developed symptoms were clinically scored by multiple endpoints for 30 days. Results: Treatment of HPBLs with varying concentrations of extract before exposure to different doses of γ-radiation resulted in a significant (P < 0.0001) decline in radiation-induced micronuclei. The effect was dose-dependent and concentration-driven. The maximum protection, ~70%, was achieved at a concentration of 9 µg/ml. Extract-treated, whole-body-irradiated mice showed 50%, 83.3%, and 100% survival for 400, 600, and 800 mg/kg, with clinical scores of 1.05, 0.43, and 0, respectively, compared to irradiated mice with a clinical score of 6.03 and 0% survival. Conclusion: Our findings indicate that bamboo leaf extract reduced radiation-induced cytogenetic damage. It also increased the survival rate and reduced radiation-induced sickness and mortality after exposure to a lethal dose of gamma radiation.

Keywords: bamboo leaf extract, Cytokinesis blocked micronuclei (CBMN) assay, ionizing radiation, radio protector

Procedia PDF Downloads 142
964 Identifying the Structural Components of Old Buildings from Floor Plans

Authors: Shi-Yu Xu

Abstract:

The top three risk factors that have contributed to building collapses during past earthquake events in Taiwan are: "irregular floor plans or elevations," "insufficient columns in single-bay buildings," and the "weak-story problem." Fortunately, these unsound structural characteristics can be directly identified from the floor plans. However, due to the vast number of old buildings, conducting manual inspections to identify these compromised structural features in all existing structures would be time-consuming and prone to human errors. This study aims to develop an algorithm that utilizes artificial intelligence techniques to automatically pinpoint the structural components within a building's floor plans. The obtained spatial information will be utilized to construct a digital structural model of the building. This information, particularly regarding the distribution of columns in the floor plan, can then be used to conduct preliminary seismic assessments of the building. The study employs various image processing and pattern recognition techniques to enhance detection efficiency and accuracy. The study enables a large-scale evaluation of structural vulnerability for numerous old buildings, providing ample time to arrange for structural retrofitting in those buildings that are at risk of significant damage or collapse during earthquakes.
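
For illustration, a minimal sketch of how the column-identification step could be approached is shown below, assuming columns appear in a binarized plan as small, solid, nearly square blobs; the file name, size limits, and fill-ratio test are placeholder assumptions, not the authors' method.

```python
import cv2

# Illustrative sketch: locate candidate column cross-sections in a binarized
# floor plan, assuming columns are drawn as small, solid, nearly square blobs.
# File name and thresholds are placeholders, not values from the paper.
img = cv2.imread("floor_plan.png", cv2.IMREAD_GRAYSCALE)
_, binary = cv2.threshold(img, 0, 255, cv2.THRESH_BINARY_INV + cv2.THRESH_OTSU)

contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)

columns = []
for c in contours:
    x, y, w, h = cv2.boundingRect(c)
    area = cv2.contourArea(c)
    fill_ratio = area / float(w * h)          # solid blobs have a high fill ratio
    aspect = w / float(h)
    if 10 < w < 80 and 10 < h < 80 and fill_ratio > 0.8 and 0.5 < aspect < 2.0:
        columns.append((x + w // 2, y + h // 2))  # store column centroid (pixels)

print(f"{len(columns)} candidate columns found:", columns)
```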

Keywords: structural vulnerability detection, object recognition, seismic capacity assessment, old buildings, artificial intelligence

Procedia PDF Downloads 81
963 IEEE802.15.4e Based Scheduling Mechanisms and Systems for Industrial Internet of Things

Authors: Ho-Ting Wu, Kai-Wei Ke, Bo-Yu Huang, Liang-Lin Yan, Chun-Ting Lin

Abstract:

With recent advances in technology, wireless sensor networks (WSNs) have become one of the most promising candidates to implement the wireless industrial internet of things (IIoT) architecture. However, legacy IEEE 802.15.4-based WSN technology, such as the Zigbee system, cannot meet the stringent QoS requirements of low-power, real-time, and highly reliable transmission imposed by the IIoT environment. Recently, the IEEE developed the IEEE 802.15.4e Time Slotted Channel Hopping (TSCH) access mode to serve this purpose. Furthermore, the IETF 6TiSCH working group has proposed standards to integrate IEEE 802.15.4e with the IPv6 protocol smoothly to form a complete protocol stack for IIoT. In this work, we develop key network technologies for an IEEE 802.15.4e-based wireless IIoT architecture, focusing on practical design and system implementation. We realize an OpenWSN-based wireless IIoT system. The system architecture is divided into three main parts: web server, network manager, and sensor nodes. The web server provides the user interface, allowing the user to view the status of sensor nodes and instruct sensor nodes to follow commands via a user-friendly browser. The network manager is responsible for the establishment, maintenance, and management of scheduling and topology information. It executes the centralized scheduling algorithm, sends the scheduling table to each node, and manages the sensing tasks of each device. Sensor nodes complete the assigned tasks and send the sensed data. Furthermore, to prevent scheduling errors due to packet loss, a schedule inspection mechanism is implemented to verify the correctness of the schedule table. In addition, when the network topology changes, the system generates a new schedule table based on the changed topology to ensure the proper operation of the system. To enhance the performance of such a system, we further propose dynamic bandwidth allocation and distributed scheduling mechanisms. The developed distributed scheduling mechanism enables each individual sensor node to build, maintain, and manage the dedicated link bandwidth with its parent and children nodes based on locally observed information by exchanging Add/Delete commands via two processes. The first process, termed the schedule initialization process, allows each sensor node pair to identify the available idle slots to allocate the basic dedicated transmission bandwidth. The second process, termed the schedule adjustment process, enables each sensor node pair to adjust their allocated bandwidth dynamically according to the measured traffic loading. Such technology can sufficiently satisfy the dynamic bandwidth requirements of frequently changing environments. Last but not least, we propose a packet retransmission scheme to enhance the system performance of the centralized scheduling algorithm when the packet delivery rate (PDR) is low. We propose a multi-frame retransmission mechanism to allow every single network node to resend each packet at least a predefined number of times. The multi-frame architecture is built according to the number of layers of the network topology. Performance results via simulation reveal that such a retransmission scheme is able to provide sufficiently high transmission reliability while maintaining low packet transmission latency. Therefore, the QoS requirements of IIoT can be achieved.
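
As an illustration of the two-process distributed scheduling idea described above (schedule initialization followed by traffic-driven adjustment), the sketch below allocates and adjusts dedicated timeslots for one node pair; the slotframe length, initial allocation, and packets-per-slot figure are assumptions, not values from the paper.

```python
# Illustrative sketch of the two-process idea behind the distributed scheduler:
# (1) a node pair claims idle slots for a basic dedicated allocation, and
# (2) the allocation is grown or shrunk according to measured traffic.
SLOTFRAME_LEN = 101

def init_schedule(slotframe, n_slots):
    """Process 1: claim the first n_slots idle cells (None == idle)."""
    claimed = []
    for ts, owner in enumerate(slotframe):
        if owner is None and len(claimed) < n_slots:
            slotframe[ts] = "A->B"      # an 'Add' agreed between the node pair
            claimed.append(ts)
    return claimed

def adjust_schedule(slotframe, claimed, queued_pkts, pkts_per_slot=1):
    """Process 2: add cells when the queue grows, delete cells when it shrinks."""
    needed = max(1, -(-queued_pkts // pkts_per_slot))   # ceiling division
    while len(claimed) < needed:                        # Add
        extra = init_schedule(slotframe, 1)
        if not extra:
            break                                       # slotframe is full
        claimed += extra
    while len(claimed) > needed:                        # Delete
        slotframe[claimed.pop()] = None
    return claimed

slotframe = [None] * SLOTFRAME_LEN
cells = init_schedule(slotframe, 2)          # basic dedicated bandwidth
cells = adjust_schedule(slotframe, cells, queued_pkts=7)
print("allocated timeslots:", cells)
```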

Keywords: IEEE 802.15.4e, industrial internet of things (IIoT), scheduling mechanisms, wireless sensor networks (WSN)

Procedia PDF Downloads 157
962 Motion Detection Method for Clutter Rejection in the Bio-Radar Signal Processing

Authors: Carolina Gouveia, José Vieira, Pedro Pinho

Abstract:

Cardiopulmonary signal monitoring, without the use of contact electrodes or any type of in-body sensors, has several applications such as sleep monitoring and continuous monitoring of vital signs in bedridden patients. This system also has applications in the vehicular environment to monitor the driver, in order to avoid possible accidents in case of cardiac failure. Thus, the bio-radar system proposed in this paper can measure vital signs accurately by using the Doppler effect principle, which relates the received signal properties to the change in distance between the radar antennas and the person's chest wall. Since the bio-radar aims to monitor subjects in real time and over long periods, it is impossible to guarantee the patient's immobilization; hence their random motion will interfere with the acquired signals. In this paper, a mathematical model of the bio-radar is presented, as well as its simulation in MATLAB. The algorithm used for breathing rate extraction is explained, and a method for DC offset removal based on a motion detection system is proposed. Furthermore, experimental tests were conducted to prove that the unavoidable random motion can be used to estimate the DC offsets accurately and thus remove them successfully.
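
The keywords mention ellipse fitting for DC offset estimation; the sketch below uses the simpler algebraic (Kasa-style) circle fit on synthetic IQ data as a stand-in, since the received samples of a CW Doppler bio-radar ideally lie on an arc whose centre is the DC offset. The signal model and noise levels are illustrative assumptions.

```python
import numpy as np

def fit_circle_center(i_sig, q_sig):
    """Algebraic (Kasa) circle fit: returns the estimated DC offsets (xc, yc).

    The complex baseband samples of a CW Doppler radar ideally lie on an arc of
    a circle whose centre is the DC offset; least-squares fitting recovers it.
    """
    A = np.column_stack([i_sig, q_sig, np.ones_like(i_sig)])
    b = i_sig**2 + q_sig**2
    c, _, _, _ = np.linalg.lstsq(A, b, rcond=None)
    return c[0] / 2.0, c[1] / 2.0

# Synthetic example (values are illustrative, not from the paper):
t = np.linspace(0, 10, 2000)
phase = 0.6 * np.sin(2 * np.pi * 0.25 * t)        # chest-wall motion -> phase arc
dc_i, dc_q = 0.8, -0.3                            # true DC offsets
i_sig = dc_i + np.cos(phase) + 0.01 * np.random.randn(t.size)
q_sig = dc_q + np.sin(phase) + 0.01 * np.random.randn(t.size)

print("estimated DC offsets:", fit_circle_center(i_sig, q_sig))
```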

Keywords: bio-signals, DC component, Doppler effect, ellipse fitting, radar, SDR

Procedia PDF Downloads 134
961 Isolate-Specific Variations among Clinical Isolates of Brucella Identified by Whole-Genome Sequencing, Bioinformatics and Comparative Genomics

Authors: Abu S. Mustafa, Mohammad W. Khan, Faraz Shaheed Khan, Nazima Habibi

Abstract:

Brucellosis is a zoonotic disease of worldwide prevalence. There are at least four species and several strains of Brucella that cause human disease. Brucella genomes have very limited variation across strains, which hinders strain identification using classical molecular techniques, including PCR and 16S rDNA sequencing. The aim of this study was to perform whole genome sequencing of clinical isolates of Brucella and to perform bioinformatics and comparative genomics analyses to determine the existence of genetic differences across the isolates of a single Brucella species and strain. Draft sequence data were generated from 15 clinical isolates of Brucella melitensis (biovar 2 strain 63/9) using the MiSeq next-generation sequencing platform. The generated reads were used for further assembly and analysis. All analyses were performed on a bioinformatics workstation (8-core i7 processor, 8 GB RAM, Bio-Linux operating system). FastQC was used to assess read quality, and low-quality reads were trimmed or eliminated using Fastx_trimmer. Assembly was performed using the Velvet and ABySS assemblers. The ordering of assembled contigs was performed with Mauve. The RAST online server was employed to annotate the contig assemblies. Annotated genomes were compared using the Mauve and ACT tools. The QC score for the DNA sequence data generated by MiSeq was higher than 30 for 80% of reads, with more than 100x coverage, which suggested that the data could be utilized for further analysis. However, when analyzed with FastQC, the read quality of four samples was not good enough to create a complete draft genome, so the remaining 11 samples were used for further analysis. The comparative genome analyses showed that, despite sharing the same gene sets, single nucleotide polymorphisms and insertions/deletions existed across different genomes, which provided a variable extent of diversity to these bacteria. In conclusion, next-generation sequencing, bioinformatics, and comparative genome analysis can be utilized to find variations (point mutations, insertions, and deletions) across different genomes of Brucella within a single strain. This information could be useful in surveillance and epidemiological studies. This work was supported by Kuwait University Research Sector grants MI04/15 and SRUL02/13.

Keywords: brucella, bioinformatics, comparative genomics, whole genome sequencing

Procedia PDF Downloads 375
960 Reconfigurable Consensus Achievement of Multi Agent Systems Subject to Actuator Faults in a Leaderless Architecture

Authors: F. Amirarfaei, K. Khorasani

Abstract:

In this paper, reconfigurable consensus achievement of a team of agents with marginally stable linear dynamics and a single input channel is considered. The control algorithm is based on a first-order linear protocol. After the occurrence of a loss-of-effectiveness (LOE) fault in one of the actuators, the control gain is redesigned, using the imperfect information on actuator effectiveness provided by the fault detection and identification module, so that consensus can still be reached. The idea is based on modeling the change in effectiveness as a change in the Laplacian matrix. Then, as special cases of this class of systems, teams of single integrators as well as double integrators are considered, and their behavior subject to an LOE fault is examined. The well-known relative-measurement consensus protocol is applied to a leaderless team of single-integrator as well as double-integrator systems, and the Gershgorin disk theorem is employed to determine whether fault occurrence affects system stability and team consensus achievement. The analyses show that a loss-of-effectiveness fault in the actuator(s) of integrator systems affects neither system stability nor consensus achievement.
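
A minimal sketch of the leaderless single-integrator case is given below: the relative-measurement consensus protocol is simulated with one actuator scaled by a loss-of-effectiveness factor, and the states still converge to a common value. The graph, gains, and fault level are illustrative assumptions.

```python
import numpy as np

# Illustrative sketch: relative-measurement consensus of four single integrators
# on a leaderless undirected graph, with a loss-of-effectiveness (LOE) fault
# scaling one agent's actuator.
A = np.array([[0, 1, 0, 1],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [1, 0, 1, 0]], dtype=float)      # adjacency of a ring of 4 agents
L = np.diag(A.sum(axis=1)) - A                 # graph Laplacian

effectiveness = np.ones(4)
effectiveness[2] = 0.4                         # 60% loss of effectiveness on agent 3

x = np.array([1.0, -2.0, 4.0, 0.5])            # initial states
dt = 0.01
for _ in range(3000):
    u = -L @ x                                 # first-order consensus protocol
    x = x + dt * (effectiveness * u)           # faulty actuator scales its input

print("final states:", np.round(x, 3))         # states converge to a common value
```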

Keywords: multi-agent system, actuator fault, stability analysis, consensus achievement

Procedia PDF Downloads 330
959 Simulation of Focusing of Diamagnetic Particles in Ferrofluid Microflows with a Single Set of Overhead Permanent Magnets

Authors: Shuang Chen, Zongqian Shi, Jiajia Sun, Mingjia Li

Abstract:

Microfluidics is a technology in which small amounts of fluids are manipulated using channels with dimensions of tens to hundreds of micrometers. At present, this technology is required for applications in several fields, including disease diagnostics, genetic engineering, and environmental monitoring. Among these, the manipulation of microparticles and cells in microfluidic devices, especially their separation, has attracted considerable attention. In a magnetic field, separation methods include positive and negative magnetophoresis. By comparison, negative magnetophoresis is a label-free technology. It has many advantages, e.g., easy operation, low cost, and simple design. Before the separation of particles or cells, focusing them into a single tight stream is usually a necessary upstream operation. In this work, the focusing of diamagnetic particles in ferrofluid microflows with a single set of overhead permanent magnets is investigated numerically. The geometric model of the simulation is based on the configuration of previous experiments. The straight microchannel is 24 mm long and has a rectangular cross-section of 100 μm in width and 50 μm in depth. The spherical diamagnetic particles of 10 μm in diameter are suspended in ferrofluid. The initial concentration of the ferrofluid c₀ is 0.096%, and the flow rate of the ferrofluid is 1.8 mL/h. The magnetic field is induced by five identical rectangular neodymium-iron-boron permanent magnets (1/8 × 1/8 × 1/8 in.), and it is calculated by the equivalent charge source (ECS) method. The flow of the ferrofluid is governed by the Navier–Stokes equations. The trajectories of the particles are solved by the discrete phase model (DPM) in the ANSYS FLUENT program. The positions of the diamagnetic particles are recorded by transient simulation. Compared with the results of the mentioned experiments, our simulation shows the consistent result that diamagnetic particles are gradually focused in the ferrofluid under the magnetic field. In addition, diamagnetic particle focusing is studied by varying the flow rate of the ferrofluid; in agreement with the experiments, focusing improves as the flow rate increases. Furthermore, the influence of other factors on diamagnetic particle focusing is investigated, e.g., the width and depth of the microchannel, the concentration of the ferrofluid, and the diameter of the diamagnetic particles.

Keywords: diamagnetic particle, focusing, microfluidics, permanent magnet

Procedia PDF Downloads 126
958 Identity Management in Virtual Worlds Based on Biometrics Watermarking

Authors: S. Bader, N. Essoukri Ben Amara

Abstract:

With technological development and the rise of virtual worlds, these spaces are becoming more and more attractive to cybercriminals hidden behind avatars and fictitious identities. Since access to these spaces is not restricted or controlled, some impostors take advantage to gain unauthorized access and engage in cybercrime. This paper proposes an identity management approach for securing access to virtual worlds. The major purpose of the suggested solution is to install a strong security mechanism to protect virtual identities represented by avatars. Thus, only legitimate users, through their corresponding avatars, are allowed to access the platform resources. Access is controlled by integrating an authentication process based on biometrics. In the registration request process, a user fingerprint is enrolled and then encrypted into a watermark utilizing a cancelable and non-invertible algorithm for its protection. After a user personalizes their representative character, the biometric mark is embedded into the avatar through a watermarking procedure. The authenticity of the avatar identity is verified when it requests authorization for access. We have evaluated the proposed approach on a dataset of avatars from various virtual worlds, and we have registered promising performance results in terms of authentication accuracy and acceptance and rejection rates.

Keywords: identity management, security, biometrics authentication and authorization, avatar, virtual world

Procedia PDF Downloads 261
957 An Image Processing Based Approach for Assessing Wheelchair Cushions

Authors: B. Farahani, R. Fadil, A. Aboonabi, B. Hoffmann, J. Loscheider, K. Tavakolian, S. Arzanpour

Abstract:

Wheelchair users spend long hours in a sitting position, and selecting the right cushion is highly critical in preventing pressure ulcers in that demographic. Pressure mapping systems (PMS) are typically used in clinical settings by therapists to identify the sitting profile and pressure points in the sitting area and to select the cushion that best fits the user. A PMS is a flexible mat composed of arrays of distributed networks of flexible sensors. The output of a PMS is a color-coded image that shows the intensity of the pressure concentration. Therapists use the PMS images to compare how well different cushions fit each user. This process is highly subjective and requires good visual memory for the best outcome. This paper aims to develop an image processing technique to analyze the images of a PMS and provide an objective measure to assess cushions based on their pressure distribution mappings. In this paper, we first reviewed the skeletal anatomy of the human sitting area and its relation to the PMS image. This knowledge is then used to identify the important features that must be considered in image processing. We then developed an algorithm based on those features to analyze the images and rank them according to their fit to the users' needs.

Keywords: dynamic cushion, image processing, pressure mapping system, wheelchair

Procedia PDF Downloads 167
956 Study of Effects of 3D Semi-Spherical Basin-Shape-Ratio on the Frequency Content and Spectral Amplitudes of the Basin-Generated Surface Waves

Authors: Kamal, J. P. Narayan

Abstract:

In the present work, the effects of basin-shape-ratio on the frequency content and spectral amplitudes of the basin-generated surface waves, and the associated spatial variation of ground motion amplification and differential ground motion in a 3D semi-spherical basin, have been studied. A recently developed 3D fourth-order spatially accurate time-domain finite-difference (FD) algorithm based on the parsimonious staggered-grid approximation of the 3D viscoelastic wave equations was used to compute the seismic responses. The simulated results demonstrate an increase in both the frequency content and the spectral amplitudes of the basin-generated surface waves, and in the duration of ground motion in the basin, with increasing shape-ratio of the semi-spherical basin. An increase of the average spectral amplification (ASA), differential ground motion (DGM), and the average aggravation factor (AAF) towards the centre of the semi-spherical basin was obtained.

Keywords: 3D viscoelastic simulation, basin-generated surface waves, basin-shape-ratio effects, average spectral amplification, aggravation factors and differential ground motion

Procedia PDF Downloads 500
955 Effect of Auraptene on the Enzymatic Glutathione Redox-System in Nrf2 Knockout Mice

Authors: Ludmila A. Gavriliuc, Jerry McLarty, Heather E. Kleiner, J. Michael Mathis

Abstract:

Background: The citrus coumarin auraptene (Aur) is an effective chemopreventive agent, as manifested in many models of diseases and cancer. Nuclear factor erythroid 2-related factor (Nrf2) is an important regulator of genes induced by oxidative stress, such as glutathione S-transferases, heme oxygenase-1, and peroxiredoxin 1, by activating the antioxidant response element (ARE). Genetic and biochemical evidence has demonstrated that glutathione (GSH) and the glutathione-dependent enzymes glutathione reductase (GR), glutathione peroxidases (GPs), and glutathione S-transferases (GSTs) are responsible for the control of intracellular reduction-oxidation status and participate in cellular adaptation to oxidative stress. The effect of Aur on the activity of GR and GPs (Se-GP and Se-iGP) and on the content of GSH in the liver, kidney, and spleen is insufficiently explored. Aim: Our goal was to examine the influence of Aur on the GSH redox-system in Nrf2 wild-type and Nrf2 knockout mice via activation of Nrf2 and the ARE. Methods: Twenty female mice, 10 Nrf2 wild-type (WT) and 10 Nrf2 (-/-) knockout (KO), were bred and genotyped for our study. The activity of GR, Se-GP, Se-iGP, GST, G6PD, CytP450 reductase, and catalase (Cat), and the content of GSH, were analyzed in the liver, kidney, and spleen using spectrophotometric methods. The specific enzyme activities and the amount of GSH were analyzed with ANOVA and Spearman statistical methods. Results: Aur (200 mg/kg) treatment induced hepatic GST, GR, and Se-GP activity and inhibited their activity in the spleen of mice, most likely via activation of the ARE through Nrf2. Activation of Se-GP and G6PD in the kidney by Aur also appears to be controlled through Nrf2. Results of the non-parametric Spearman correlation analysis indicated a strong positive correlation between GR and G6PD only in the liver of WT control mice (r = +0.972; p < 0.005) and in the kidney of KO control mice (r = +0.958; p < 0.005). The observed low content of GSH in the liver of KO mice indicated an increase in its participation in the neutralization of toxic substances in the absence of induction of GSH-dependent enzymes such as GST, GR, Se-GP, and Se-iGP. Activation of CytP450 in the kidney and spleen and of Cat in the liver of KO mice probably reflects another regulatory mechanism for these enzymes. Conclusion: The results obtained indicate that Aur can modulate the activity of genes and the antioxidant enzymatic redox-system of GSH, which is responsible for the control of intracellular reduction-oxidation status.

Keywords: auraptene, glutathione, GST, Nrf2

Procedia PDF Downloads 144
954 Design of Speed Bump Recognition System Integrated with Adjustable Shock Absorber Control

Authors: Ming-Yen Chang, Sheng-Hung Ke

Abstract:

This research focuses on the development of a speed bump identification system for real-time control of adjustable shock absorbers in vehicular suspension systems. The study initially involved the collection of images of various speed bumps and rubber speed bump profiles found on roadways. These images were utilized for training and recognition purposes through the deep learning object detection algorithm YOLOv5. Subsequently, the trained speed bump identification program was integrated with an in-vehicle camera system for live image capture during driving. These images were instantly transmitted to a computer for processing. Using the principles of monocular vision ranging, the distance between the vehicle and an approaching speed bump was determined. The appropriate control distance was established through both practical vehicle measurements and theoretical calculations. In conjunction with the electronically adjustable shock absorbers installed in the vehicle, a shock absorber control system was devised to dynamically adapt the damping force just prior to encountering a speed bump. This system effectively mitigates passenger discomfort and enhances ride quality.
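
A minimal sketch of the monocular ranging step is shown below, using the standard pinhole-camera relation between the detected bounding-box height and distance; the focal length, bump height, and control distance are placeholder values, not the calibrated figures from the study.

```python
# Illustrative sketch of pinhole-model monocular ranging: once the detector
# (e.g., YOLOv5) returns a bounding box for a speed bump, its assumed physical
# height and the camera focal length give an approximate distance.
FOCAL_LENGTH_PX = 1200.0     # camera focal length in pixels (from calibration)
BUMP_HEIGHT_M = 0.05         # assumed real-world height of the speed bump (m)
CONTROL_DISTANCE_M = 12.0    # assumed distance at which damping is adjusted

def estimate_distance(bbox_height_px: float) -> float:
    """Pinhole model: distance = f_px * real_height / pixel_height."""
    return FOCAL_LENGTH_PX * BUMP_HEIGHT_M / bbox_height_px

def should_adjust_damping(bbox_height_px: float) -> bool:
    return estimate_distance(bbox_height_px) <= CONTROL_DISTANCE_M

print(estimate_distance(8.0))       # ~7.5 m for an 8-pixel-tall detection
print(should_adjust_damping(8.0))   # True -> adapt the shock absorber damping
```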

Keywords: adjustable shock absorbers, image recognition, monocular vision ranging, ride quality

Procedia PDF Downloads 65
953 Risk Mitigation of Data Causality Analysis Requirements AI Act

Authors: Raphaël Weuts, Mykyta Petik, Anton Vedder

Abstract:

Artificial Intelligence has the potential to create, and already creates, enormous value in healthcare. Prescriptive systems might be able to make the use of healthcare capacity more efficient. Such systems might entail interpretations that exclude the effect of confounders, which brings risks with it. Those risks might be mitigated by regulation that prevents systems entailing such risks from coming to market. One modality of regulation is legislation, and the European AI Act is an example of such a regulatory instrument that might mitigate these risks. To assess the risk mitigation potential of the AI Act for those risks, this research focuses on a case study of a hypothetical application of medical device software that entails the aforementioned risks. The AI Act refers to the harmonised norms for already existing legislation, here the European Medical Device Regulation. The issue at hand is a causal link between a confounder and the value the algorithm optimises for by proxy. The research identifies where the AI Act already looks at confounders (i.a., feedback loops in systems that continue to learn after being placed on the market). The research also identifies where the current proposal by Parliament leaves legal uncertainty on the necessity to check for confounders that do not influence the input of the system, when the system does not continue to learn after being placed on the market. The authors propose an amendment to Article 15 of the AI Act that would require high-risk systems to be developed in such a way as to mitigate the risks from the aforementioned confounders.

Keywords: AI Act, healthcare, confounders, risks

Procedia PDF Downloads 257
952 Healthcare Big Data Analytics Using Hadoop

Authors: Chellammal Surianarayanan

Abstract:

The healthcare industry is generating large amounts of data driven by various needs such as record keeping, physicians' prescriptions, medical imaging, sensor data, Electronic Patient Records (EPR), laboratory tests, pharmacy, etc. Healthcare data is so big and complex that it cannot be managed by conventional hardware and software. The complexity of healthcare big data arises from the large volume of data, the velocity with which the data are accumulated, and the different varieties of data, such as structured, semi-structured, and unstructured. Despite the complexity of big data, if the trends and patterns that exist within the big data are uncovered and analyzed, higher quality healthcare at lower cost can be provided. Hadoop is an open source software framework for distributed processing of large data sets across clusters of commodity hardware using a simple programming model. The core components of Hadoop include the Hadoop Distributed File System, which offers a way to store large amounts of data across multiple machines, and MapReduce, which offers a way to process large data sets with a parallel, distributed algorithm on a cluster. The Hadoop ecosystem also includes various other tools such as Hive (a SQL-like query language), Pig (a higher-level query language for MapReduce), HBase (a columnar data store), etc. In this paper, an analysis has been done of how healthcare big data can be processed and analyzed using the Hadoop ecosystem.
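
As an illustration of the MapReduce programming model mentioned above, the sketch below is a Hadoop Streaming-style mapper and reducer that counts admissions per diagnosis code; the record layout (diagnosis code in the third tab-separated column) is an assumption. With Hadoop Streaming, such a script would be supplied as the mapper and reducer commands of the streaming jar.

```python
# Illustrative Hadoop Streaming job (mapper and reducer in one file for brevity):
# counts hospital admissions per diagnosis code from tab-separated EPR records.
import sys

def mapper():
    for line in sys.stdin:
        fields = line.rstrip("\n").split("\t")
        if len(fields) >= 3:
            print(f"{fields[2]}\t1")            # emit <diagnosis_code, 1>

def reducer():
    current_key, count = None, 0
    for line in sys.stdin:                      # input arrives sorted by key
        key, value = line.rstrip("\n").split("\t")
        if key != current_key and current_key is not None:
            print(f"{current_key}\t{count}")
            count = 0
        current_key = key
        count += int(value)
    if current_key is not None:
        print(f"{current_key}\t{count}")

if __name__ == "__main__":
    mapper() if sys.argv[1] == "map" else reducer()
```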

Keywords: big data analytics, Hadoop, healthcare data, towards quality healthcare

Procedia PDF Downloads 409
951 Pricing European Options under Jump Diffusion Models with Fast L-stable Padé Scheme

Authors: Salah Alrabeei, Mohammad Yousuf

Abstract:

The goal of option pricing theory is to help investors manage their money, enhance returns, and control their financial future by theoretically valuing their options. Modeling option prices by Black-Scholes models with jumps ensures that market movements are taken into account. However, only numerical methods can solve this model. Furthermore, not all numerical methods are efficient for solving these models, because the payoffs are non-smooth or have discontinuous derivatives at the exercise price. In this paper, the exponential time differencing (ETD) method is applied for solving the partial integro-differential equations arising in pricing European options under Merton's and Kou's jump-diffusion models. The Fast Fourier Transform (FFT) algorithm is used as a matrix-vector multiplication solver, which reduces the complexity from O(M²) to O(M log M). A partial-fraction form of Padé schemes is used to overcome the complexity of inverting matrix polynomials. These two tools guarantee efficient and accurate numerical solutions. We construct a parallel and easy-to-implement version of the numerical scheme. Numerical experiments are given to show how fast and accurate our scheme is.
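
The abstract uses the FFT as a matrix-vector multiplication solver; since the discretized jump integral typically has Toeplitz structure, the sketch below shows the standard circulant-embedding trick that realizes the O(M log M) product. The matrix here is random and serves only to check the result against the dense product, not to reproduce the paper's discretization.

```python
import numpy as np

def toeplitz_matvec_fft(first_col, first_row, x):
    """Multiply a Toeplitz matrix by a vector in O(M log M) via circulant
    embedding and the FFT, instead of the O(M^2) dense product.

    first_col: first column of the Toeplitz matrix (length M)
    first_row: first row (length M, first_row[0] == first_col[0])
    """
    m = len(first_col)
    # Embed the Toeplitz matrix in a circulant matrix of size 2M - 1.
    c_emb = np.concatenate([first_col, first_row[1:][::-1]])
    x_pad = np.concatenate([x, np.zeros(m - 1)])
    y = np.fft.ifft(np.fft.fft(c_emb) * np.fft.fft(x_pad))
    return y[:m].real

# Check against the dense product on a small illustrative example.
rng = np.random.default_rng(0)
m = 6
col = rng.standard_normal(m)
row = np.concatenate([[col[0]], rng.standard_normal(m - 1)])
T = np.array([[col[i - j] if i >= j else row[j - i] for j in range(m)] for i in range(m)])
x = rng.standard_normal(m)
print(np.allclose(T @ x, toeplitz_matvec_fft(col, row, x)))   # True
```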

Keywords: integro-differential equations, L-stable methods, pricing European options, jump-diffusion model

Procedia PDF Downloads 146
950 Metagenomics Analysis on Microbial Communities of Sewage Sludge from Nyeri-Kangemi Wastewater Treatment Plant, Nyeri County-Kenya

Authors: Allan Kiptanui Kimisto, Geoffrey Odhiambo Ongondo, Anastasia Wairimu Muia, Cyrus Ndungu Kimani

Abstract:

The major challenge to proper sewage sludge treatment processes is the poor understanding of sludge microbiome diversities. This study applied the whole-genome shotgun metagenomics technique to profile the microbial composition of sewage sludge in two active digestion lagoons at the Nyeri-Kangemi Wastewater Treatment Plant in Nyeri County, Kenya. Total microbial community DNA was extracted from samples using the ZymoBIOMICS™ DNA Miniprep Kit and sequenced using shotgun metagenomics. Samples were analyzed using the MG-RAST software (Project ID: mgp100988), which allowed comparison of taxonomic diversity before β-diversity studies for Bacteria, Archaea, and Eukaryotes. The study identified 57 phyla, 145 classes, 301 orders, 506 families, 963 genera, and 1980 species. Bacteria dominated the microbes and comprised 28 phyla, 51 classes, 110 orders, 243 families, 597 genera, and 1518 species. Bacteroides (6.77%) was dominant, followed by Acinetobacter (1.44%), belonging to the Gammaproteobacteria, and Acidovorax (1.36%), Bacillus (1.24%), and Clostridium (1.02%), belonging to the Betaproteobacteria. Archaea recorded 5 phyla, 13 classes, 19 orders, 29 families, 60 genera, and 87 species, with the dominant genera being Methanospirillum (16.01%), Methanosarcina (15.70%), and Methanoregula (14.80%), and with Methanosaeta (8.74%), Methanosphaerula (5.48%), and Methanobrevibacter (5.03%) being the subdominant group. The eukaryotes were the least abundant and comprised 24 phyla, 81 classes, 301 orders, 506 families, 963 genera, and 980 species. Arabidopsis (4.91%) and Caenorhabditis (4.81%) dominated the eukaryotes, while Dictyostelium (3.63%) and Drosophila (2.08%) were the subdominant genera. All these microbes play distinct roles in the anaerobic treatment process of sewage sludge. The local sludge microbial composition and abundance variations may be due to the age difference between the two digestion lagoons in operation at the plant and the different degradation roles played by the taxa. The information presented in this study can help in the genetic manipulation or formulation of optimal microbial ratios to improve their effectiveness in sewage sludge treatment. This study recommends further research on how the different taxa respond to environmental changes over time and space.

Keywords: shotgun metagenomics, sludge, bacteria, archaea, eukaryotes

Procedia PDF Downloads 92
949 A Framework for Early Differential Diagnosis of Tropical Confusable Diseases Using the Fuzzy Cognitive Map Engine

Authors: Faith-Michael E. Uzoka, Boluwaji A. Akinnuwesi, Taiwo Amoo, Flora Aladi, Stephen Fashoto, Moses Olaniyan, Joseph Osuji

Abstract:

The overarching aim of this study is to develop a soft-computing system for the differential diagnosis of tropical diseases. These conditions are of concern to health bodies, physicians, and the community at large because of their mortality rates and the difficulties in early diagnosis, due to the fact that they present with symptoms that overlap and thus become ‘confusable’. We report on the first phase of our study, which focuses on the development of a fuzzy cognitive map (FCM) model for early differential diagnosis of tropical diseases. We used malaria as a case disease to show the effectiveness of the FCM technology as an aid to the medical practitioner in the diagnosis of tropical diseases. Our model takes cognizance of manifested symptoms and other non-clinical factors that could contribute to symptom manifestation. Our model showed 85% accuracy in diagnosis, compared with the physicians' initial hypothesis, which stood at 55% accuracy. It is expected that the next stage of our study will provide a multi-disease, multi-symptom model that also improves efficiency by utilizing a decision support filter whose algorithm mimics the physician's diagnostic process.
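
A minimal sketch of fuzzy cognitive map inference is given below, iterating the usual update A(t+1) = f(A(t) + A(t)W) with a sigmoid squashing function; the concepts and weights are invented placeholders, not the model built in the study.

```python
import numpy as np

def sigmoid(x, lam=1.0):
    return 1.0 / (1.0 + np.exp(-lam * x))

def fcm_infer(weights, initial_state, n_iter=30, lam=1.0):
    """Iterate the standard FCM update A(t+1) = f(A(t) + A(t) @ W) until it settles.

    weights[i, j] is the causal influence of concept i on concept j in [-1, 1];
    concepts could be symptoms and risk factors feeding a disease concept.
    """
    a = np.asarray(initial_state, dtype=float)
    for _ in range(n_iter):
        a = sigmoid(a + a @ np.asarray(weights), lam)
    return a

# Tiny illustrative map (weights and concepts are assumptions, not the authors'):
# concepts: [fever, chills, recent mosquito exposure, malaria]
W = np.array([[0.0, 0.0, 0.0, 0.6],
              [0.0, 0.0, 0.0, 0.5],
              [0.0, 0.0, 0.0, 0.7],
              [0.0, 0.0, 0.0, 0.0]])
print(fcm_infer(W, [1.0, 1.0, 1.0, 0.0]))   # last entry ~ activation of 'malaria'
```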

Keywords: medical diagnosis, tropical diseases, fuzzy cognitive map, decision support filters, malaria differential diagnosis

Procedia PDF Downloads 315
948 Bidirectional Long Short-Term Memory-Based Signal Detection for Orthogonal Frequency Division Multiplexing With All Index Modulation

Authors: Mahmut Yildirim

Abstract:

This paper proposes bidirectional long short-term memory (Bi-LSTM) network-aided, deep learning (DL)-based signal detection for orthogonal frequency division multiplexing with all index modulation (OFDM-AIM), namely Bi-DeepAIM. OFDM-AIM is developed to increase the spectral efficiency of OFDM with index modulation (OFDM-IM), a promising multi-carrier technique for communication systems beyond 5G. In this paper, due to its strong classification ability, Bi-LSTM is considered an alternative to the maximum likelihood (ML) algorithm, which is used for signal detection in the classical OFDM-AIM scheme. The performance of Bi-DeepAIM is compared with the LSTM network-aided DL-based OFDM-AIM (DeepAIM) and the classic OFDM-AIM that uses ML-based signal detection, in terms of bit error rate (BER) performance and computational time. Simulation results show that Bi-DeepAIM obtains better BER performance than DeepAIM and a lower computation time in signal detection than ML-AIM.
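
A minimal sketch of a Bi-LSTM subblock classifier in Keras is shown below; the subblock length, the real/imaginary feature layout, the class count, and the random training data are placeholder assumptions rather than the Bi-DeepAIM configuration.

```python
import numpy as np
import tensorflow as tf

# Illustrative Bi-LSTM detector: classify a received OFDM-AIM subblock into one
# of the legal subblock realizations. Dimensions below are placeholder assumptions.
SUBBLOCK_LEN = 4        # subcarriers per subblock (time steps fed to the LSTM)
NUM_CLASSES = 16        # number of legal subblock realizations in the lookup table

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(SUBBLOCK_LEN, 2)),          # [Re, Im] per subcarrier
    tf.keras.layers.Bidirectional(tf.keras.layers.LSTM(64)),
    tf.keras.layers.Dense(64, activation="relu"),
    tf.keras.layers.Dense(NUM_CLASSES, activation="softmax"),
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

# Training would use noisy received subblocks as inputs and the index of the
# transmitted subblock as the label; random data stands in for that here.
x = np.random.randn(1024, SUBBLOCK_LEN, 2).astype("float32")
y = np.random.randint(0, NUM_CLASSES, size=(1024,))
model.fit(x, y, epochs=1, batch_size=64, verbose=0)
```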

Keywords: bidirectional long short-term memory, deep learning, maximum likelihood, OFDM with all index modulation, signal detection

Procedia PDF Downloads 66
947 State Estimation of a Biotechnological Process Using Extended Kalman Filter and Particle Filter

Authors: R. Simutis, V. Galvanauskas, D. Levisauskas, J. Repsyte, V. Grincas

Abstract:

This paper deals with advanced state estimation algorithms for the estimation of biomass concentration and specific growth rate in a typical fed-batch biotechnological process. This biotechnological process was represented by a nonlinear, mass-balance-based process model. An Extended Kalman Filter (EKF) and a Particle Filter (PF) were used to estimate the unmeasured state variables from oxygen uptake rate (OUR) and base consumption (BC) measurements. To obtain more general results, a simplified process model was used in the EKF and PF estimation algorithms. This model does not require any special growth kinetic equations and could be applied for state estimation in various bioprocesses. This investigation focused on comparing the estimation quality of the EKF and PF estimators under different measurement noise levels. The simulation results show that the Particle Filter algorithm requires significantly more computation time for state estimation but gives lower estimation errors for both biomass concentration and specific growth rate. Also, the tuning procedure for the Particle Filter is simpler than for the EKF. Consequently, the Particle Filter should be preferred in real applications, especially for monitoring industrial bioprocesses, where simplified implementation procedures are always desirable.
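
A minimal sketch of a bootstrap particle filter for this kind of problem is given below; the growth model, the OUR-like measurement equation, and all constants are simplified assumptions, not the authors' mass-balance model.

```python
import numpy as np

# Illustrative bootstrap particle filter: estimate biomass X and specific growth
# rate mu from a noisy OUR-like measurement, assuming y = k * mu * X.
rng = np.random.default_rng(1)
N, dt, k = 2000, 0.1, 1.0

def propagate(X, mu):
    """Random-walk model for mu, exponential growth for biomass."""
    mu_new = mu + 0.01 * rng.standard_normal(mu.size)
    X_new = X * np.exp(mu_new * dt)
    return X_new, mu_new

def particle_filter(measurements, meas_std=0.05):
    X = rng.uniform(0.5, 2.0, N)              # particles for biomass
    mu = rng.uniform(0.05, 0.3, N)            # particles for specific growth rate
    estimates = []
    for y in measurements:
        X, mu = propagate(X, mu)
        w = np.exp(-0.5 * ((y - k * mu * X) / meas_std) ** 2)   # likelihood weights
        w /= w.sum()
        idx = rng.choice(N, size=N, p=w)      # multinomial resampling
        X, mu = X[idx], mu[idx]
        estimates.append((np.mean(X), np.mean(mu)))
    return estimates

# Synthetic "true" trajectory and measurements, then filtering:
t = np.arange(0, 10, dt)
X_true = 1.0 * np.exp(0.15 * t)
y_meas = k * 0.15 * X_true + 0.05 * rng.standard_normal(t.size)
print(particle_filter(y_meas)[-1])            # estimated (biomass, mu) at the end
```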

Keywords: biomass concentration, extended Kalman filter, particle filter, state estimation, specific growth rate

Procedia PDF Downloads 420
946 Development of Trigger Tool to Identify Adverse Drug Events From Warfarin Administered to Patient Admitted in Medical Wards of Chumphae Hospital

Authors: Puntarikorn Rungrattanakasin

Abstract:

Objectives: To develop a trigger tool to warn about the risk of bleeding as an adverse event of warfarin use during admission to the medical wards of Chumphae Hospital. Methods: A retrospective study was performed by reviewing the medical records of patients admitted between June 1st, 2020 and May 31st, 2021. ADEs were evaluated by Naranjo's algorithm. The international normalized ratio (INR) and bleeding events during admission were collected. Statistical analyses, including the Chi-square test and the Receiver Operating Characteristic (ROC) curve for the optimal INR threshold, were used for the study. Results: Among the 139 admissions, the INR ranged between 0.86 and 14.91; there was a total of 15 bleeding events, of which 9 were mild and 6 were severe. Bleeding occurred whenever the INR was greater than 2.5 and reached statistical significance (p < 0.05), which was in concordance with the ROC curve and yielded 100% sensitivity and 60% specificity in the detection of a bleeding event. In this regard, an INR greater than 2.5 was considered the optimal threshold for promptly alerting to a bleeding tendency. Conclusions: An INR value greater than 2.5 would be an appropriate trigger to warn of the risk of bleeding for patients taking warfarin at Chumphae Hospital.
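
The threshold selection described above can be reproduced in outline with an ROC curve, picking the first operating point that keeps 100% sensitivity; the sketch below uses synthetic stand-in data, not the hospital records.

```python
import numpy as np
from sklearn.metrics import roc_curve

# Synthetic stand-in data: INR values and whether a bleeding event occurred.
inr = np.array([1.2, 1.8, 2.1, 2.4, 2.6, 2.7, 3.0, 3.4, 4.2, 5.1, 6.0, 8.5])
bleed = np.array([0,   0,   0,   0,   0,   1,   0,   1,   1,   1,   1,   1])

fpr, tpr, thresholds = roc_curve(bleed, inr)

# First operating point that catches every bleeding event (100% sensitivity),
# mirroring how the INR > 2.5 trigger was selected in the study.
idx = int(np.argmax(tpr == 1.0))
print(f"trigger at INR >= {thresholds[idx]:.2f}, "
      f"sensitivity = {tpr[idx]:.2f}, specificity = {1.0 - fpr[idx]:.2f}")
```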

Keywords: trigger tool, warfarin, risk of bleeding, medical wards

Procedia PDF Downloads 144
945 Computer Aided Analysis of Breast Based Diagnostic Problems from Mammograms Using Image Processing and Deep Learning Methods

Authors: Ali Berkan Ural

Abstract:

This paper presents the analysis, evaluation, and pre-diagnosis of early-stage breast diagnostic problems (breast cancer, nodules, or lumps) by a Computer Aided Diagnosis (CAD) system from mammogram radiological images. According to the statistics, the time factor is crucial to discovering the disease in the patient (especially in women) as early and as quickly as possible. In the study, a new algorithm is developed using advanced image processing and deep learning methods to detect and classify the problem at an early stage with higher accuracy. This system first works with image processing methods (image acquisition, noise removal, region growing segmentation, morphological operations, breast border extraction, advanced segmentation, obtaining regions of interest (ROIs), etc.), segments the area of interest of the breast, and then analyzes these partially obtained areas for cancer/lump detection in order to diagnose the disease. After segmentation, using the spectrogram images, five different deep learning-based methods (the Convolutional Neural Network (CNN)-based AlexNet, ResNet50, VGG16, DenseNet, and Xception) are applied to classify the breast-related problems.
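
A minimal sketch of the region growing segmentation step is shown below; the seed position, tolerance, and synthetic test image are illustrative assumptions, not part of the CAD pipeline described in the paper.

```python
import numpy as np
from collections import deque

def region_grow(image, seed, tolerance=10):
    """Simple region growing: starting from a seed pixel, add 4-connected
    neighbours whose intensity stays within `tolerance` of the seed value.
    In a CAD pipeline the seed would come from an earlier detection step."""
    h, w = image.shape
    seed_val = float(image[seed])
    mask = np.zeros((h, w), dtype=bool)
    queue = deque([seed])
    mask[seed] = True
    while queue:
        r, c = queue.popleft()
        for dr, dc in ((-1, 0), (1, 0), (0, -1), (0, 1)):
            rr, cc = r + dr, c + dc
            if 0 <= rr < h and 0 <= cc < w and not mask[rr, cc]:
                if abs(float(image[rr, cc]) - seed_val) <= tolerance:
                    mask[rr, cc] = True
                    queue.append((rr, cc))
    return mask

# Tiny synthetic "mammogram patch": a bright blob on a dark background.
img = np.zeros((64, 64), dtype=np.uint8)
img[20:40, 25:45] = 200
roi = region_grow(img, seed=(30, 35), tolerance=15)
print("segmented pixels:", int(roi.sum()))    # 400 = 20 x 20 blob
```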

Keywords: computer aided diagnosis, breast cancer, region growing, segmentation, deep learning

Procedia PDF Downloads 89
944 Clostridium thermocellum DBT-IOC-C19, A Potential CBP Isolate for Ethanol Production

Authors: Nisha Singh, Munish Puri, Collin Barrow, Deepak Tuli, Anshu S. Mathur

Abstract:

The biological conversion of lignocellulosic biomass to ethanol is a promising strategy to address the present global crisis of depleting fossil fuels. The existing bioethanol production technologies have cost constraints due to the involvement of mandatory pretreatment and extensive enzyme production steps. A unique process configuration known as consolidated bioprocessing (CBP) is believed to be a potentially cost-effective process due to its efficient integration of enzyme production, saccharification, and fermentation into one step. For several favorable reasons, such as single-step conversion, no need for adding exogenous enzymes, and facilitated product recovery, CBP has gained the attention of researchers worldwide. However, there are several technical and economic barriers that need to be overcome to make consolidated bioprocessing a commercially viable process. Finding a natural candidate CBP organism is critically important, and thermophilic anaerobes are the preferred microorganisms. The thermophilic anaerobes that can represent CBP mainly belong to the genera Clostridium, Caldicellulosiruptor, Thermoanaerobacter, Thermoanaerobacterium, and Geobacillus. Among them, Clostridium thermocellum has received increased attention as a high-utility CBP candidate due to its highest growth rate on crystalline cellulose, the presence of a highly efficient cellulosome system, and its ability to produce ethanol directly from cellulose. Recently, the availability of genetic and molecular tools aiding the metabolic engineering of Clostridium thermocellum has further facilitated the viability of a commercial CBP process. With this in view, we have specifically screened cellulolytic and xylanolytic thermophilic anaerobic ethanol-producing bacteria from unexplored hot springs in India. One of the isolates is a potential CBP organism identified as a new strain of Clostridium thermocellum. This strain has shown superior Avicel and xylan degradation under unoptimized conditions compared to reported wild-type strains of Clostridium thermocellum and produced more than 50 mM ethanol in 72 hours from 1% Avicel at 60°C. In addition, this strain shows good ethanol tolerance and growth on both hexose and pentose sugars. Hence, with further optimization, this new strain could be developed into a potential CBP microbe.

Keywords: Clostridium thermocellum, consolidated bioprocessing, ethanol, thermophilic anaerobes

Procedia PDF Downloads 396
943 An Emergence of Pinus taeda Needle Defoliation and Tree Mortality in Alabama, USA

Authors: Debit Datta, Jeffrey J. Coleman, Scott A. Enebak, Lori G. Eckhardt

Abstract:

Pinus taeda, commonly known as loblolly pine, is a crucial timber species native to the southeastern USA. An emerging problem, which has come to be known as loblolly pine needle defoliation (LPND), has been encountered for the past few years; it is threatening the ecological health of southeastern forests and the economic vitality of the region's timber industry. Currently, more than 1000 hectares of loblolly plantations in Alabama are affected by similar symptoms, creating concern among southeastern landowners and forest managers. However, it is still uncertain whether LPND results from one fungal pathogen or a combination of several. Therefore, the objectives of the study were to identify and characterize the fungi associated with LPND in the southeastern USA and to document the damage being done to loblolly pine as a result of repeated defoliation. Identification of fungi was confirmed using classical morphological methods (microscopic examination of the infected needles), conventional and species-specific priming (SSPP) PCR, and ITS sequencing. To date, 17 species of fungi, either cultured from pine needles or forming fruiting bodies on them, have been identified based on morphology and genetic sequence data. Among them, the brown-spot pathogen Lecanosticta acicola has been frequently recovered from pine needles in both spring and summer. Moreover, ophiostomatoid fungi associated with pine decline, such as Leptographium procerum and L. terebrantis, have also been recovered from root samples of the infected stands. Trees were increasingly and repeatedly chlorotic and defoliated from 2019 to 2020. Based on morphological observations and molecular data, the emerging loblolly pine needle defoliation is due in large part to the brown-spot pathogen L. acicola, followed by the pine decline pathogens L. procerum and L. terebrantis. Root pathogens were suspected to emerge later, and their cumulative effects contribute to the widespread mortality of the trees. It is likely that longer wet springs and warmer temperatures are favorable to disease development and may be important in the disease ecology of LPND. Therefore, the outbreak of the disease is expected to expand over a larger geographical area under changing climatic conditions.

Keywords: brown-spot fungi, emerging disease, defoliation, loblolly pine

Procedia PDF Downloads 134
942 Application of Deep Neural Networks to Assess Corporate Credit Rating

Authors: Parisa Golbayani, Dan Wang, Ionuţ Florescu

Abstract:

In this work, we apply machine learning techniques to financial statement reports in order to assess a company's credit rating. Specifically, the work analyzes the performance of four neural network architectures (MLP, CNN, CNN2D, LSTM) in predicting corporate credit ratings as issued by Standard and Poor's. The paper focuses on companies from the energy, financial, and healthcare sectors in the US. The goal of this analysis is to improve the application of machine learning algorithms to credit assessment. To accomplish this, the study investigates three questions. First, we investigate if the algorithms perform better when using a selected subset of important features or whether better performance is obtained by allowing the algorithms to select features themselves. Second, we address the temporal aspect inherent in financial data and study whether it is important for the results obtained by a machine learning algorithm. Third, we aim to answer if one of the four particular neural network architectures considered consistently outperforms the others, and if so, under which conditions. This work frames the problem as several case studies to answer these questions and analyzes the results using ANOVA and multiple comparison testing procedures.

Keywords: convolutional neural network, long short term memory, multilayer perceptron, credit rating

Procedia PDF Downloads 234
941 Ultrastructural Characterization of Lipid Droplets of Rat Hepatocytes after Whole Body 60-Cobalt Gamma Radiation

Authors: Ivna Mororó, Lise P. Labéjof, Stephanie Ribeiro, Kely Almeida

Abstract:

Lipid droplets (LDs) are normally present in greater or lesser numbers in the cytoplasm of almost all eukaryotic and some prokaryotic cells. They are independent organelles composed of a lipid ester core and a surface phospholipid monolayer. As a lipid storage form, they provide an available source of energy for the cell. Recently it was demonstrated that they play an important role in many other cellular processes. Among the many unresolved questions about them, it is not even known how LDs are formed, how lipids are recruited to LDs, and how they interact with other organelles. Excess fat in the organism is pathological and often associated with the development of some genetic, hormonal, or behavioral diseases. The formation and accumulation of lipid droplets in the cytoplasm can be increased by exogenous physical or chemical agents. It is well known that ionizing radiation affects lipid metabolism, resulting in increased lipogenesis in cells, but the details of this process are unknown. To better understand the mode of formation of LDs in liver cells, we investigate their ultrastructural morphology after irradiation. For that, Wistar rats were exposed to whole-body gamma radiation from 60-cobalt at various single doses. Samples of the livers were processed for analysis under a conventional transmission electron microscope. We found that, when compared to controls, morphological changes in liver cells were evident at the higher doses of radiation used. A great number of lipid droplets of different sizes and homogeneous content was detected, and some of them merged with each other. In some cells, diffuse LDs, not limited by a phospholipid monolayer, were observed. This finding suggests that the phospholipid monolayer of the LDs was disrupted by ionizing radiation exposure, which promotes lipid peroxidation of endomembranes. Thus, the absence of the phospholipid monolayer may prevent several cellular activities: lipid exocytosis, which requires the merging of the LD membrane with the plasma membrane; the interaction of LDs with other membrane-bound organelles such as the endoplasmic reticulum (ER), the Golgi, and mitochondria; and lipolysis of the lipid esters contained in the LDs, which requires enzymes located in membrane-bound organelles such as the ER. All these impediments can contribute to lipid accumulation in the cytoplasm and the development of diseases such as liver steatosis, cirrhosis, and cancer.

Keywords: radiobiology, hepatocytes, lipid metabolism, transmission electron microscopy

Procedia PDF Downloads 309
940 Inversion of Electrical Resistivity Data: A Review

Authors: Shrey Sharma, Gunjan Kumar Verma

Abstract:

High-density electrical prospecting has been widely used in groundwater investigation, civil engineering, and environmental surveys. For efficient inversion, the forward modeling routine, sensitivity calculation, and inversion algorithm must be efficient. This paper attempts to provide a brief summary of the past and ongoing developments of the method. It includes reviews of the procedures used for data acquisition, processing, and inversion of electrical resistivity data, based on a compilation of the academic literature. In recent times, there has been a significant evolution in field survey designs and data inversion techniques for the resistivity method. In general, 2-D inversion of resistivity data is carried out using the linearized least-squares method with a local optimization technique. Multi-electrode and multi-channel systems have made it possible to conduct large 2-D, 3-D, and even 4-D surveys efficiently to resolve complex geological structures that were not possible with traditional 1-D surveys. 3-D surveys play an increasingly important role in very complex areas where 2-D models suffer from artifacts due to off-line structures. Continued developments in computation technology, as well as fast data inversion techniques and software, have made it possible to use optimization techniques to obtain model parameters to a higher accuracy. A brief discussion of the limitations of the electrical resistivity method is also presented.
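
A minimal sketch of the linearized least-squares iteration mentioned above is given below; the damped Gauss-Newton update is generic, and the toy forward model stands in for a real resistivity forward solver.

```python
import numpy as np

def linearized_inversion(forward, jacobian, d_obs, m0, lam=0.1, n_iter=10):
    """Generic damped (regularized) linearized least-squares iteration,
    m_{k+1} = m_k + (J^T J + lam I)^{-1} J^T (d_obs - F(m_k)),
    which is the backbone of most 2-D resistivity inversion codes.
    The forward model below is a toy stand-in, not a real resistivity solver.
    """
    m = m0.copy()
    for _ in range(n_iter):
        r = d_obs - forward(m)
        J = jacobian(m)
        m = m + np.linalg.solve(J.T @ J + lam * np.eye(m.size), J.T @ r)
    return m

# Toy example: recover model parameters from a mildly nonlinear "measurement" map.
rng = np.random.default_rng(0)
G = rng.standard_normal((30, 5)) * 0.5
forward = lambda m: G @ m + 0.05 * (G @ m) ** 2          # mildly nonlinear forward map
jacobian = lambda m: G + 0.1 * (G @ m)[:, None] * G      # analytic Jacobian of the map
m_true = np.array([2.0, 1.5, 1.0, 0.5, 0.2])
d_obs = forward(m_true) + 0.01 * rng.standard_normal(30)
print(np.round(linearized_inversion(forward, jacobian, d_obs, np.zeros(5)), 2))
```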

Keywords: inversion, limitations, optimization, resistivity

Procedia PDF Downloads 360
939 System for the Detecting of Fake Profiles on Online Social Networks Using Machine Learning and the Bio-Inspired Algorithms

Authors: Sekkal Nawel, Mahammed Nadir

Abstract:

The proliferation of online activities on Online Social Networks (OSNs) has captured significant user attention. However, this growth has been hindered by the emergence of fraudulent accounts that do not represent real individuals and violate privacy regulations within social network communities. Consequently, it is imperative to identify and remove these profiles to enhance the security of OSN users. In recent years, researchers have turned to machine learning (ML) to develop strategies and methods to tackle this issue. Numerous studies have been conducted in this field to compare various ML-based techniques. However, the existing literature still lacks a comprehensive examination, especially considering different OSN platforms. Additionally, the utilization of bio-inspired algorithms has been largely overlooked. Our study conducts an extensive comparative analysis of various fake profile detection techniques in online social networks. The results of our study indicate that both supervised and unsupervised models, along with other machine learning techniques, are effective for detecting fake profiles in social media. To achieve optimal results, we have incorporated six bio-inspired algorithms to enhance the performance of fake profile identification.
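
The six bio-inspired algorithms used in the study are not detailed here; as one hedged example of the general idea, the sketch below applies a simple genetic algorithm to feature-subset selection with a placeholder fitness function (a real study would plug in cross-validated detection accuracy).

```python
import numpy as np

rng = np.random.default_rng(42)

def fitness(mask, X, y):
    """Placeholder fitness: reward class separation of the selected features and
    penalize subset size. A real study would use cross-validated accuracy."""
    if mask.sum() == 0:
        return 0.0
    Xs = X[:, mask.astype(bool)]
    gap = np.abs(Xs[y == 1].mean(axis=0) - Xs[y == 0].mean(axis=0)).mean()
    return gap - 0.01 * mask.sum()

def genetic_feature_selection(X, y, pop=20, gens=30, p_mut=0.05):
    n = X.shape[1]
    population = rng.integers(0, 2, size=(pop, n))
    for _ in range(gens):
        scores = np.array([fitness(ind, X, y) for ind in population])
        parents = population[np.argsort(scores)[-pop // 2:]]       # selection
        children = []
        while len(children) < pop:
            a, b = parents[rng.integers(len(parents), size=2)]
            cut = rng.integers(1, n)                               # one-point crossover
            child = np.concatenate([a[:cut], b[cut:]])
            child ^= (rng.random(n) < p_mut).astype(child.dtype)   # mutation
            children.append(child)
        population = np.array(children)
    best = max(population, key=lambda ind: fitness(ind, X, y))
    return best.astype(bool)

# Synthetic profile features: only the first three carry signal.
X = rng.standard_normal((300, 10))
y = rng.integers(0, 2, 300)
X[:, :3] += y[:, None] * 1.5
print("selected features:", np.flatnonzero(genetic_feature_selection(X, y)))
```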

Keywords: machine learning, bio-inspired algorithm, detection, fake profile, system, social network

Procedia PDF Downloads 64
938 The Trigger-DAQ System in the Mu2e Experiment

Authors: Antonio Gioiosa, Simone Doanti, Eric Flumerfelt, Luca Morescalchi, Elena Pedreschi, Gianantonio Pezzullo, Ryan A. Rivera, Franco Spinella

Abstract:

The Mu2e experiment at Fermilab aims to measure the charged-lepton flavour-violating neutrinoless conversion of a negative muon into an electron in the field of an aluminum nucleus. With the expected experimental sensitivity, Mu2e will improve the previous limit by four orders of magnitude. The Mu2e data acquisition (DAQ) system provides hardware and software to collect digitized data from the tracker, calorimeter, cosmic ray veto, and beam monitoring systems. Mu2e's trigger and data acquisition system (TDAQ) uses otsdaq as its solution. Developed at Fermilab, otsdaq uses the artdaq DAQ framework and the art analysis framework under the hood for event transfer, filtering, and processing. Otsdaq is an online DAQ software suite with a focus on flexibility and scalability, providing a multi-user, web-based interface accessible through the Chrome or Firefox web browser. The detector readout controllers (ROCs) of the tracker and calorimeter stream zero-suppressed data continuously to the data transfer controller (DTC). Data are then read over the PCIe bus by a software filter algorithm that selects events, which are finally combined with the data flux coming from the cosmic ray veto (CRV) system.

Keywords: trigger, DAQ, Mu2e, Fermilab

Procedia PDF Downloads 152
937 Grain Size Characteristics and Sediments Distribution in the Eastern Part of Lekki Lagoon

Authors: Mayowa Philips Ibitola, Abe Oluwaseun Banji, Olorunfemi Akinade-Solomon

Abstract:

A total of 20 bottom sediment samples were collected from the Lekki Lagoon during the wet and dry seasons. The study was carried out to determine the textural characteristics, sediment distribution pattern, and energy of transportation within the lagoon system. The sediment grain sizes and depth profiles were analyzed using the dry sieving method, with a MATLAB algorithm for processing. The granulometric analysis reveals fine-grained sand for both the wet and dry seasons, with average mean values of 2.03 ϕ and -2.88 ϕ, respectively. Sediments were moderately sorted, with average inclusive standard deviations of 0.77 ϕ and -0.82 ϕ. Skewness varied from strongly coarse-skewed to near-symmetrical (-0.34 ϕ and 0.09 ϕ). The average kurtosis values were 0.87 ϕ and -1.4 ϕ (platykurtic and leptokurtic). Overall, the bathymetry shows an average depth of 4.0 m. The deepest and shallowest areas have depths of 11.2 m and 0.5 m, respectively. A high concentration of fine sand was observed in the deep areas compared to the shallow areas during both the wet and dry seasons. Statistical parameters show that the sediments are overall sorted and deposited under low-energy conditions over a long distance. However, the sediment distribution and transport pattern of the Lekki Lagoon are controlled by a low-energy current, and the down-slope configuration of the bathymetry enhances the sorting and deposition rate in the lagoon.
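
A minimal sketch of how the graphic statistics quoted above (mean, sorting, skewness, kurtosis) are computed from sieve data via the Folk and Ward formulas is given below; the sieve classes and weights are illustrative, not the Lekki Lagoon measurements.

```python
import numpy as np

def folk_ward_stats(phi_sizes, weights):
    """Folk & Ward (1957) graphic statistics from a sieve analysis.

    phi_sizes: sieve sizes in phi units (ascending); weights: retained fractions.
    Percentile phi values are read off the cumulative curve by interpolation.
    """
    cum = np.cumsum(weights) / np.sum(weights) * 100.0
    p = lambda q: np.interp(q, cum, phi_sizes)
    p5, p16, p25, p50, p75, p84, p95 = (p(q) for q in (5, 16, 25, 50, 75, 84, 95))
    mean = (p16 + p50 + p84) / 3.0
    sorting = (p84 - p16) / 4.0 + (p95 - p5) / 6.6
    skewness = ((p16 + p84 - 2 * p50) / (2 * (p84 - p16))
                + (p5 + p95 - 2 * p50) / (2 * (p95 - p5)))
    kurtosis = (p95 - p5) / (2.44 * (p75 - p25))
    return mean, sorting, skewness, kurtosis

# Illustrative sieve data (phi classes and weight %), not the Lekki Lagoon values:
phi = np.array([0.0, 0.5, 1.0, 1.5, 2.0, 2.5, 3.0, 3.5, 4.0])
wt = np.array([2.0, 5.0, 10.0, 18.0, 25.0, 20.0, 12.0, 6.0, 2.0])
mean, sorting, skew, kurt = folk_ward_stats(phi, wt)
print(f"mean={mean:.2f} phi, sorting={sorting:.2f}, skewness={skew:.2f}, kurtosis={kurt:.2f}")
```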

Keywords: Lekki Lagoon, Marine sediment, bathymetry, grain size distribution

Procedia PDF Downloads 228
936 Effect of Atmospheric Turbulence on Hybrid FSO/RF Link Availability under Qatar's Harsh Climate

Authors: Abir Touati, Syed Jawad Hussain, Farid Touati, Ammar Bouallegue

Abstract:

Although there has been growing interest in hybrid free-space optical/radio frequency (FSO/RF) communication systems, the current literature is limited to results obtained in moderate or cold environments. In this paper, using a soft switching approach, we investigate the effect of weather inhomogeneities on the strength of turbulence, and hence the channel refractive index, under Qatar's harsh environment, and their influence on hybrid FSO/RF availability. In this approach, either the FSO link, the RF link, both simultaneously, or neither can be active. Based on the soft switching approach and a finite-state Markov chain (FSMC) process, we model the channel fading for the two links and derive a mathematical expression for the outage probability of the hybrid system. Then, we evaluate the behavior of the hybrid FSO/RF system under hazy and harsh weather. Results show that FSO/RF soft switching renders the system outage probability lower than that of each link individually. A soft switching algorithm is being implemented on FPGAs using a Raptor code interfaced to the two terminals of a 1 Gbps/100 Mbps FSO/RF hybrid system, the first implemented in the region. Experimental results are compared to the above simulation results.

Keywords: atmospheric turbulence, haze, hybrid FSO/RF, outage probability, refractive index

Procedia PDF Downloads 415