Search results for: modified simplex algorithm

1370 Motion Detection Method for Clutter Rejection in the Bio-Radar Signal Processing

Authors: Carolina Gouveia, José Vieira, Pedro Pinho

Abstract:

Cardiopulmonary signal monitoring without contact electrodes or any type of in-body sensor has several applications, such as sleep monitoring and the continuous monitoring of vital signs in bedridden patients. The system also has applications in the vehicular environment to monitor the driver, in order to avoid a possible accident in case of cardiac failure. The bio-radar system proposed in this paper can measure vital signs accurately by using the Doppler effect principle, which relates the received signal properties to the change in distance between the radar antennas and the person's chest wall. Since the bio-radar aims to monitor subjects in real time and over long periods, it is impossible to guarantee patient immobilization, and hence their random motion will interfere with the acquired signals. In this paper, a mathematical model of the bio-radar is presented, as well as its simulation in MATLAB. The algorithm used for breathing-rate extraction is explained, and a method for DC offset removal based on a motion detection system is proposed. Furthermore, experimental tests were conducted to prove that the unavoidable random motion can be used to estimate the DC offsets accurately and thus remove them successfully.
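
As a hedged illustration of the DC offset estimation idea suggested by the ellipse-fitting keyword, the sketch below fits a circle to the radar I/Q samples by least squares (Kasa fit) and takes the fitted centre as the offset estimate; the function names and the synthetic breathing signal are assumptions, not the authors' implementation.

```python
import numpy as np

def estimate_dc_offsets(i_sig, q_sig):
    """Estimate DC offsets of a Doppler radar I/Q signal by least-squares
    circle fitting (Kasa fit); the fitted centre approximates the offsets."""
    x, y = np.asarray(i_sig), np.asarray(q_sig)
    # Circle model x^2 + y^2 + A*x + B*y + C = 0, solved in least squares.
    M = np.column_stack([x, y, np.ones_like(x)])
    rhs = -(x**2 + y**2)
    (A, B, C), *_ = np.linalg.lstsq(M, rhs, rcond=None)
    return -A / 2.0, -B / 2.0          # centre = (DC_I, DC_Q)

# Synthetic chest-wall motion arc plus DC offsets (assumed values)
t = np.linspace(0, 10, 2000)
phase = 0.6 * np.sin(2 * np.pi * 0.3 * t)            # breathing-like phase
i_sig = 0.2 + np.cos(phase) + 0.01 * np.random.randn(t.size)
q_sig = -0.1 + np.sin(phase) + 0.01 * np.random.randn(t.size)
print(estimate_dc_offsets(i_sig, q_sig))              # roughly (0.2, -0.1)
```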

Keywords: bio-signals, DC component, Doppler effect, ellipse fitting, radar, SDR

Procedia PDF Downloads 145
1369 SARS-CoV-2 Transmission Risk Factors among Patients from a Metropolitan Community Health Center, Puerto Rico, July 2020 to March 2022

Authors: Juan C. Reyes, Linnette Rodríguez, Héctor Villanueva, Jorge Vázquez, Ivonne Rivera

Abstract:

In July 2020, a private non-profit community health center (HealthProMed) that serves people without a medical insurance plan or with limited resources in one of the most populated areas in San Juan, Puerto Rico, implemented a COVID-19 case investigation and contact-tracing surveillance system. Nursing personnel at the health center completed a computerized case investigation form that was translated, adapted, and modified from CDC’s Patient Under Investigation (PUI) Form. Between July 13, 2020, and March 17, 2022, a total of 9,233 SARS-CoV-2 tests were conducted at the health center, 16.9% of which were classified as confirmed cases (positive molecular test) and 27.7% as probable cases (positive serologic test). Most of the confirmed cases were females (60.0%), under 20 years old (29.1%), and living in their homes (59.1%). In the 14 days before the onset of symptoms, 26.3% of confirmed cases reported going to the supermarket, 22.4% had contact with a known COVID-19 case, and 20.7% went to work. The symptoms most commonly reported were sore throat (33.4%), runny nose (33.3%), cough (24.9%), and headache (23.2%). The most common preexisting medical conditions among confirmed cases were hypertension (19.3%), chronic lung disease including asthma, emphysema, and COPD (13.3%), and diabetes mellitus (12.8%). Multiple logistic regression analysis revealed that patients who used alcohol frequently during the last two weeks (OR=1.43; 95%CI: 1.15-1.77), those who were in contact with a positive case (OR=1.58; 95%CI: 1.33-1.88) and those who were obese (OR=1.82; 95%CI: 1.24-2.69) were significantly more likely to be a confirmed case after controlling for sociodemographic variables. Implementing a case investigation and contact-tracing component at community health centers can be of great value in the prevention and control of COVID-19 at the community level and could be used in future outbreaks.
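
A minimal sketch of the adjusted odds-ratio computation described above, using a multiple logistic regression in statsmodels. The data frame below is a synthetic placeholder, not the HealthProMed surveillance data, and the variable names are assumptions.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Hypothetical analysis dataset: one row per tested patient
df = pd.DataFrame({
    "confirmed":   np.random.binomial(1, 0.17, 500),   # 1 = confirmed case
    "alcohol_use": np.random.binomial(1, 0.20, 500),
    "contact":     np.random.binomial(1, 0.25, 500),
    "obese":       np.random.binomial(1, 0.30, 500),
    "age":         np.random.randint(5, 90, 500),
    "female":      np.random.binomial(1, 0.60, 500),
})

# Multiple logistic regression: exposures adjusted for sociodemographic variables
X = sm.add_constant(df[["alcohol_use", "contact", "obese", "age", "female"]])
model = sm.Logit(df["confirmed"], X).fit(disp=False)

# Odds ratios and 95% confidence intervals
or_table = np.exp(pd.concat([model.params, model.conf_int()], axis=1))
or_table.columns = ["OR", "2.5%", "97.5%"]
print(or_table)
```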

Keywords: community health center, Puerto Rico, risk factors, SARS-CoV-2

Procedia PDF Downloads 123
1368 Optimisation of Structural Design by Integrating Genetic Algorithms in the Building Information Modelling Environment

Authors: Tofigh Hamidavi, Sepehr Abrishami, Pasquale Ponterosso, David Begg

Abstract:

Structural design and analysis is an important and time-consuming process, particularly at the conceptual design stage. Decisions made at this stage can have an enormous effect on the entire project, as it becomes ever costlier and more difficult to alter choices made early in the construction process. Hence, optimisation of the early stages of structural design can provide important efficiencies in terms of cost and time. This paper suggests a structural design optimisation (SDO) framework in which Genetic Algorithms (GAs) may be used to semi-automate the production and optimisation of early structural design alternatives. This framework has the potential to drive conceptual structural design innovation in Architecture, Engineering and Construction (AEC) projects. Moreover, the framework improves collaboration between the architectural stage and the structural stage. It will be shown that the SDO framework achieves this by generating the structural model from data extracted from the architectural model. At the moment, the proposed SDO framework is in the process of validation, involving the distribution of an online questionnaire among structural engineers in the UK.
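
A minimal GA sketch using the vocabulary from the keywords (population, generation, selection, crossover, mutation, offspring). The design variables and fitness function are invented stand-ins for a structural alternative, not the framework's BIM-extracted objective.

```python
import random

# Hypothetical design variables for one alternative: (beam depth [m], column spacing [m])
def fitness(ind):
    depth, spacing = ind
    weight = 50 * depth * (30 / spacing)          # proxy for material use
    deflection = spacing**3 / (2000 * depth**3)   # proxy serviceability check
    penalty = 1e3 * max(0.0, deflection - 0.02)   # penalise excessive deflection
    return weight + penalty                       # lower is better

def random_individual():
    return [random.uniform(0.3, 1.0), random.uniform(3.0, 9.0)]

def crossover(a, b):
    return [random.choice(pair) for pair in zip(a, b)]

def mutate(ind, rate=0.2):
    return [g * random.uniform(0.9, 1.1) if random.random() < rate else g
            for g in ind]

population = [random_individual() for _ in range(40)]
for generation in range(60):
    population.sort(key=fitness)                  # selection: keep the fittest
    parents = population[:10]
    offspring = [mutate(crossover(random.choice(parents), random.choice(parents)))
                 for _ in range(30)]
    population = parents + offspring

best = min(population, key=fitness)
print("best depth/spacing:", best, "fitness:", round(fitness(best), 2))
```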

Keywords: building information, modelling, BIM, genetic algorithm, GA, architecture-engineering-construction, AEC, optimisation, structure, design, population, generation, selection, mutation, crossover, offspring

Procedia PDF Downloads 247
1367 Study on Seismic Performance of Reinforced Soil Walls in Order to Offer Modified Pseudo Static Method

Authors: Majid Yazdandoust

Abstract:

This study suggests a displacement-based design method, using finite difference numerical modeling, for soil retaining walls reinforced with steel strips. Dynamic loading characteristics such as duration, frequency and peak ground acceleration, the geometrical characteristics of the reinforced soil structure, and the site type are considered in order to correct the pseudo-static method and, finally, to introduce the pseudo-static coefficient as a function of seismic performance level and peak ground acceleration. For this purpose, the influence of dynamic loading characteristics, reinforcement length, height of the reinforced system, and site type on the seismic behavior of steel-strip reinforced soil retaining walls is investigated. Numerical results illustrate that the seismic response of this type of wall is highly dependent on cumulative absolute velocity, maximum acceleration, wall height, and reinforcement length, so that the reinforcement length can be introduced as the main factor governing the failure shape. Consideration of the loading parameters, the mechanically stabilized earth wall parameters, and the site type showed that the method used in this study leads to more efficient designs than the methods generally suggested in codes, which are usually based on the limit-equilibrium concept. The outputs show the over-estimation of the equilibrium design methods in comparison with the displacement-based method proposed here.

Keywords: pseudo static coefficient, seismic performance design, numerical modeling, steel strip reinforcement, retaining walls, cumulative absolute velocity, failure shape

Procedia PDF Downloads 487
1366 Elaboration of Sustainable Luminescence Material Based on Rare Earth Complexes for Solar Energy Conversion

Authors: Othmane Essahili, Mohamed Ilsouk, Carine Duhayon, Omar Moudam

Abstract:

Due to their excellent and promising properties, a great deal of attention has recently been devoted to luminescent materials, particularly those utilizing rare earth elements. These materials play an essential role in low-cost energy conversion technology applications, such as luminescent solar concentrators (LSCs). They also have potential applications in Agri-PV systems and smart building windows. Luminescent materials based on europium (III) complexes are known for their high luminescence efficiency, long fluorescence lifetimes, and sharp emission bands. However, they present certain drawbacks related to their limited absorption capacity due to the forbidden 4f-4f electronic transitions. To address these drawbacks, using β-diketonate ligands as sensitizers appears to be a promising solution to enhance luminescence intensity through the antenna effect, whereby the ligand's excitation energy is transferred to the europium ions. In this study, we synthesized β-diketonate-based europium complexes with phenanthroline derivatives, modified with various methyl groups, to examine their effects on the complexes' stability in poly(methyl methacrylate) (PMMA) films. Our findings reveal that these complexes exhibit remarkable red emission and high photoluminescence quantum yield. Stability tests under different conditions for 1200 hours showed that complexes with a higher number of methyl substitutions offer improved photoluminescent stability and resistance to degradation, particularly in outdoor settings. This research underscores the potential of chemically tuned phenanthroline ligands in developing stable, efficient luminescent materials for future optoelectronic devices, including efficient and durable LSCs.

Keywords: luminescent materials, photochemistry, luminescent solar concentrators, β-diketonate-based europium complexes

Procedia PDF Downloads 69
1365 Performance Evaluation of Composite Beam under Uniform Corrosion

Authors: Ririt Aprilin Sumarsono

Abstract:

Composite members (concrete and steel) have been widely adopted for structural use due to their excellent performance in resisting load, reducing the total weight of the structure, increasing stiffness, and other advantages. On the other hand, environmental actions such as corrosion (e.g., chloride ingress) create significant time-dependent degradation of the steel. The analysis performed in this paper mainly considers uniform corrosion for evaluating the composite beam, without examining pit corrosion as the initially formed corrosion. The corrosion level, expressed as weight loss, is translated into modified values of the yield stress and modulus of elasticity of the steel. These two mechanical properties are used in this paper to observe the stresses due to corrosion attack. As the corrosion level increases, the effective width of the concrete section of the composite beam becomes wider. The position of the neutral axis of the composite section indicates the degree of composite action of the corroded beam, so that the number of shear connectors provided must be reconsidered. The flexural capacity quantification provides the stresses, and the shear capacity calculation yields the connectors needed to overcome the shear problem for a composite beam under corrosion. A simply supported composite beam model is examined in this paper under uniform corrosion, with the stresses as the focus of the evaluation. The principal stress at the first stage of composite construction declines as the corrosion level increases; the same holds for the second-stage stress analysis, where the tension region carried by the steel has a lower capacity due to corrosion. The total stresses of the composite section to be borne by the steel decrease significantly, particularly in the outermost fiber of the tension side. Meanwhile, the available compression zone becomes smaller as the corrosion level increases, so the stress occurring on the compression side is reduced as well. In conclusion, increasing the corrosion level degrades the stresses on both the compression and tension sides.
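
A hedged sketch of the property-modification step: a hypothetical linear reduction of steel yield strength and elastic modulus with corrosion weight loss, followed by an elastic bending-stress check. The reduction factors and section values are assumptions for illustration, not the paper's calibrated relations.

```python
def degraded_steel(fy0=345.0, es0=200e3, weight_loss=0.10, k_fy=0.5, k_es=0.3):
    """Return reduced yield strength [MPa] and elastic modulus [MPa]
    under an assumed linear degradation with weight loss."""
    fy = fy0 * (1.0 - k_fy * weight_loss)
    es = es0 * (1.0 - k_es * weight_loss)
    return fy, es

def bending_stress(moment_knm, section_modulus_mm3):
    """Elastic bending stress sigma = M / W, returned in MPa."""
    return moment_knm * 1e6 / section_modulus_mm3

for wl in (0.0, 0.05, 0.10, 0.20):           # corrosion levels (weight loss)
    fy, es = degraded_steel(weight_loss=wl)
    sigma = bending_stress(moment_knm=250.0, section_modulus_mm3=1.2e6)
    ratio = sigma / fy                        # demand-to-capacity indicator
    print(f"weight loss {wl:4.0%}: fy={fy:6.1f} MPa, Es={es/1e3:5.1f} GPa, "
          f"sigma/fy={ratio:.2f}")
```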

Keywords: composite beam, modulus of elasticity, stress analysis, yield strength, uniform corrosion

Procedia PDF Downloads 289
1364 Size Effects on Structural Performance of Concrete Gravity Dams

Authors: Mehmet Akköse

Abstract:

Concern about the seismic safety of concrete dams has been growing around the world, partly because the population at risk in locations downstream of major dams continues to expand and also because it is increasingly evident that the seismic design concepts in use at the time most existing dams were built were inadequate. Most of the investigations in the past have been conducted on large dams, typically above 100 m high. A large number of concrete dams in our country and in other parts of the world are less than 50 m high. Most of these dams were usually designed using pseudo-static methods, ignoring the dynamic characteristics of the structure as well as the characteristics of the ground motion. Therefore, it is important to carry out investigations on the seismic behavior of this category of dam in order to assess and evaluate the safety of existing dams and improve the knowledge base for dams of different heights to be constructed in the future. In this study, size effects on the structural performance of concrete gravity dams subjected to near- and far-fault ground motions are investigated, including dam-water-foundation interaction. For this purpose, a benchmark problem proposed by ICOLD (International Commission on Large Dams) is chosen as a numerical application. The structural performance of the dam at five different heights is evaluated according to the damage criteria of USACE (U.S. Army Corps of Engineers). Whether non-linear analysis of the dams is required or not is decided according to their structural performance. The linear elastic dynamic analyses of the dams under near- and far-fault ground motions are performed using the step-by-step integration technique. The integration time step is 0.0025 sec. The Rayleigh damping constants are calculated assuming a 5% damping ratio. The program NONSAP, modified for fluid-structure systems with the Lagrangian fluid finite element, is employed in the response calculations.
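
A short sketch of the Rayleigh damping computation mentioned above: for C = a0·M + a1·K, prescribing the same damping ratio at two control frequencies gives a0 = 2ζω1ω2/(ω1+ω2) and a1 = 2ζ/(ω1+ω2). The control frequencies below are assumed example values, not those of the benchmark dam.

```python
import numpy as np

def rayleigh_coefficients(f1_hz, f2_hz, zeta=0.05):
    """Rayleigh damping C = a0*M + a1*K with damping ratio `zeta`
    prescribed at two control frequencies f1 and f2 (in Hz)."""
    w1, w2 = 2 * np.pi * f1_hz, 2 * np.pi * f2_hz
    a0 = 2.0 * zeta * w1 * w2 / (w1 + w2)   # mass-proportional term
    a1 = 2.0 * zeta / (w1 + w2)             # stiffness-proportional term
    return a0, a1

# Hypothetical control frequencies for a dam-reservoir model (assumed values)
a0, a1 = rayleigh_coefficients(f1_hz=3.0, f2_hz=10.0, zeta=0.05)
print(f"alpha (mass) = {a0:.4f} 1/s, beta (stiffness) = {a1:.6f} s")
```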

Keywords: concrete gravity dams, Lagrangian approach, near and far-fault ground motion, USACE damage criteria

Procedia PDF Downloads 269
1363 Reconfigurable Consensus Achievement of Multi Agent Systems Subject to Actuator Faults in a Leaderless Architecture

Authors: F. Amirarfaei, K. Khorasani

Abstract:

In this paper, reconfigurable consensus achievement of a team of agents with marginally stable linear dynamics and a single input channel has been considered. The control algorithm is based on a first-order linear protocol. After the occurrence of a loss-of-effectiveness (LOE) fault in one of the actuators, using the imperfect information on the effectiveness of the actuators from the fault detection and identification module, the control gain is redesigned so that consensus is still reached. The idea is based on modeling the change in effectiveness as a change of the Laplacian matrix. Then, as special cases of this class of systems, a team of single integrators as well as double integrators is considered, and their behavior subject to an LOE fault is analyzed. The well-known relative-measurements consensus protocol is applied to a leaderless team of single-integrator as well as double-integrator systems, and the Gershgorin disk theorem is employed to determine whether fault occurrence has an effect on system stability and team consensus achievement or not. The analyses show that a loss-of-effectiveness fault in the actuator(s) of integrator systems affects neither system stability nor consensus achievement.
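
A minimal numerical illustration of the single-integrator case: agents run the relative-measurement consensus protocol while one actuator operates at reduced effectiveness. The 4-agent ring graph, fault magnitude and initial states are assumptions, not the paper's setup.

```python
import numpy as np

# Single-integrator agents x_i' = b_i * u_i with u_i = -sum_j a_ij (x_i - x_j);
# b_0 < 1 models a loss-of-effectiveness (LOE) fault on agent 0.
A = np.array([[0, 1, 0, 1],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [1, 0, 1, 0]], dtype=float)          # adjacency (undirected ring)
L = np.diag(A.sum(axis=1)) - A                     # graph Laplacian
B = np.diag([0.4, 1.0, 1.0, 1.0])                  # 60% LOE on agent 0

x = np.array([3.0, -1.0, 0.5, 2.0])                # initial states
dt, steps = 0.01, 2000
for _ in range(steps):
    x = x + dt * (-B @ L @ x)                      # Euler step of x' = -B L x

print("final states:", np.round(x, 4))             # agents still agree
print("spread:", np.ptp(x))                        # ~0 => consensus reached
```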

Keywords: multi-agent system, actuator fault, stability analysis, consensus achievement

Procedia PDF Downloads 340
1362 Modelling, Simulation, and Experimental Validation of the Influence of Golf-Ball-Inspired Dimpled Design in Drag Reduction and Improved Fuel Efficiency of Super-Mileage Vehicle

Authors: Bibin Sagaram, Ronith Stanly, S. S. Suneesh

Abstract:

Due to the dwindling supply of fuel reserves, engineers and designers now focus on fuel-efficient designs for the solution of any problem; the transportation industry is not new to this kind of approach. Though the aerodynamic benefits of the dimples on a golf ball are known, how such a design philosophy can improve the fuel efficiency of a real-life vehicle by imparting better aerodynamic performance has never been scientifically tested. The main purpose of the paper is to establish the aerodynamic benefits of the golf-ball-inspired dimpled design in improving the fuel efficiency of a Super-mileage vehicle, constructed by Team Go Viridis for ‘Shell Eco Marathon Asia 2015’, and to predict the extent to which the results can be held valid for a road car. The body design was modeled in Autodesk Inventor, and the Computational Fluid Dynamics (CFD) simulations were carried out using Ansys Fluent software. The aerodynamic parameters of the designs (with and without the golf-ball-inspired dimples) have been studied, and the results are experimentally validated against those obtained from wind tunnel tests carried out on a 1:10 scaled-down 3D-printed model. Test drives of the Super-mileage vehicle were carried out under various conditions to compare the variation in fuel efficiency with and without the golf-ball-inspired design. Preliminary investigations reveal an aerodynamic advantage of 25% for the vehicle with the golf-ball-inspired dimpled design as opposed to the normal design. Initial tests conducted by ‘Mythbusters’ on the Discovery Network using a modified road car have shown positive results, which motivated us to conduct this research work using a custom-built experimental Super-mileage vehicle. The content of the paper is relevant to the present automotive and energy industries, where improving fuel efficiency is of the topmost priority.
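
A hedged sketch of the kind of wind-tunnel reduction implied above: estimating the drag coefficient from a measured force via Cd = 2F/(ρV²A) and projecting it to full-scale drag. All numbers are assumed placeholders, not the team's measured data.

```python
RHO_AIR = 1.225          # kg/m^3, sea-level air density

def drag_coefficient(drag_force_n, speed_ms, frontal_area_m2, rho=RHO_AIR):
    return 2.0 * drag_force_n / (rho * speed_ms**2 * frontal_area_m2)

def drag_force(cd, speed_ms, frontal_area_m2, rho=RHO_AIR):
    return 0.5 * rho * cd * frontal_area_m2 * speed_ms**2

# Hypothetical tunnel readings for the 1:10 model (frontal area ~ 1/100 of full scale)
cd_smooth  = drag_coefficient(drag_force_n=0.40, speed_ms=20.0, frontal_area_m2=0.008)
cd_dimpled = drag_coefficient(drag_force_n=0.30, speed_ms=20.0, frontal_area_m2=0.008)

# Projected full-scale drag at 25 km/h, assuming Cd carries over between scales
v = 25 / 3.6
print(f"Cd smooth={cd_smooth:.3f}, dimpled={cd_dimpled:.3f}")
print(f"full-scale drag @25 km/h: smooth={drag_force(cd_smooth, v, 0.8):.2f} N, "
      f"dimpled={drag_force(cd_dimpled, v, 0.8):.2f} N")
```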

Keywords: aerodynamics, CFD, fuel efficiency, golf ball

Procedia PDF Downloads 335
1361 Effect of Acid Activation of Vermiculite on Its Carbon Dioxide Adsorption Behaviors

Authors: Katarzyna Wal, Wojciech Stawiński, Piotr Rutkowski

Abstract:

The scientific community is paying more and more attention to the problem of air pollution. Carbon dioxide is classified as one of the most harmful gases. Its emissions are generated during fossil fuel burning, waste management, and combustion and are responsible for global warming. Clay minerals constitute a group of promising materials for the role of adsorbents. They are composed of two types of phyllosilicate sheets, tetrahedral and octahedral, which form 1:1 or 2:1 structures. Vermiculite is one of their best-known representatives, which can be used as an adsorbent from the water and gaseous phases. The aim of the presented work was carbon dioxide adsorption on vermiculite. Acid-activated samples (W_NO3_x) were prepared by acid treatment with different concentrations of nitric acid (1, 2, 3, 4 mol L⁻¹). Vermiculite was subjected to modification in order to increase its porosity and adsorption properties. The prepared adsorbents were characterized using BET specific surface area analysis, thermogravimetry (TG), attenuated total reflectance-Fourier transform infrared (ATR-FTIR) spectroscopy, X-ray diffraction (XRD) and scanning electron microscopy (SEM). The applied modifications significantly increase the specific surface area, from 78.21 m² g⁻¹ for the unmodified sample (W_REF) to 536 m² g⁻¹ for W_NO3_4. The obtained results showed that acid treatment tunes the material’s functional properties by increasing the contact surface and generating more active sites in its structure. The adsorption performance in terms of carbon dioxide adsorption capacity follows the order W_REF (25.91 mg g⁻¹) < W_NO3_1 (38.54 mg g⁻¹) < W_NO3_2 (44.03 mg g⁻¹) < W_NO3_4 (67.51 mg g⁻¹) < W_NO3_3 (70.48 mg g⁻¹). Acid activation significantly improved the carbon dioxide adsorption properties of the modified samples compared to the raw material. These results demonstrate that vermiculite-based samples have the potential to be used as effective CO₂ adsorbents. Furthermore, acid treatment is a promising technique for improving the adsorption properties of clay minerals.

Keywords: adsorption, adsorbent, clay minerals, air pollution, environment

Procedia PDF Downloads 151
1360 Identity Management in Virtual Worlds Based on Biometrics Watermarking

Authors: S. Bader, N. Essoukri Ben Amara

Abstract:

With technological development and the rise of virtual worlds, these spaces are becoming more and more attractive for cybercriminals, hidden behind avatars and fictitious identities. Since access to these spaces is not restricted or controlled, some impostors take advantage of this to gain unauthorized access and commit cybercrimes. This paper proposes an identity management approach for securing access to virtual worlds. The major purpose of the suggested solution is to install a strong security mechanism to protect virtual identities represented by avatars. Thus, only legitimate users, through their corresponding avatars, are allowed to access the platform resources. Access is controlled by integrating an authentication process based on biometrics. In the registration request process, a user fingerprint is enrolled and then encrypted into a watermark utilizing a cancelable and non-invertible algorithm for its protection. After a user personalizes their representative character, the biometric mark is embedded into the avatar through a watermarking procedure. The authenticity of the avatar identity is verified when it requests authorization for access. We have evaluated the proposed approach on a dataset of avatars from various virtual worlds, and we have registered promising performance results in terms of authentication accuracy and acceptance and rejection rates.
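
A generic least-significant-bit (LSB) watermarking sketch, only to illustrate the embed/extract step of hiding a biometric-derived bit string in an avatar image. The paper's scheme protects the fingerprint with a cancelable, non-invertible transform; the payload, image and function names below are assumed placeholders.

```python
import numpy as np

def embed_bits(image, bits):
    """Write each payload bit into the LSB of one pixel of a copy of `image`."""
    flat = image.flatten().astype(np.uint8)
    if len(bits) > flat.size:
        raise ValueError("payload too large for cover image")
    flat[:len(bits)] = (flat[:len(bits)] & 0xFE) | np.asarray(bits, dtype=np.uint8)
    return flat.reshape(image.shape)

def extract_bits(image, n_bits):
    """Read the first `n_bits` LSBs back out of the marked image."""
    return (image.flatten()[:n_bits] & 0x01).astype(np.uint8)

# Hypothetical 64x64 grayscale avatar texture and a 128-bit biometric template
avatar = np.random.randint(0, 256, (64, 64), dtype=np.uint8)
template = np.random.randint(0, 2, 128, dtype=np.uint8)

marked = embed_bits(avatar, template)
recovered = extract_bits(marked, template.size)
print("template recovered intact:", bool(np.array_equal(recovered, template)))
```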

Keywords: identity management, security, biometrics authentication and authorization, avatar, virtual world

Procedia PDF Downloads 269
1359 An Image Processing Based Approach for Assessing Wheelchair Cushions

Authors: B. Farahani, R. Fadil, A. Aboonabi, B. Hoffmann, J. Loscheider, K. Tavakolian, S. Arzanpour

Abstract:

Wheelchair users spend long hours in a sitting position, and selecting the right cushion is highly critical in preventing pressure ulcers in that demographic. Pressure mapping systems (PMS) are typically used in clinical settings by therapists to identify the sitting profile and pressure points in the sitting area and to select the cushion that best fits the user. A PMS is a flexible mat composed of a distributed array of flexible sensors. The output of a PMS is a color-coded image that shows the intensity of the pressure concentration. Therapists use the PMS images to compare the fit of different cushions for each user. This process is highly subjective and requires good visual memory for the best outcome. This paper aims to develop an image processing technique to analyze PMS images and provide an objective measure to assess cushions based on their pressure distribution mappings. In this paper, we first reviewed the skeletal anatomy of the human sitting area and its relation to the PMS image. This knowledge is then used to identify the important features that must be considered in image processing. We then developed an algorithm based on those features to analyze the images and rank them according to their fit to the users' needs.
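
A hedged sketch of objective metrics that can be computed from a pressure-map array (peak pressure, contact area, and a peak-pressure-index around the most loaded region) and a simple ranking based on them. The thresholds, window size and synthetic maps are assumptions, not the anatomically derived features of the paper.

```python
import numpy as np

def cushion_metrics(pmap, contact_threshold=5.0, window=5):
    pmap = np.asarray(pmap, dtype=float)
    peak = pmap.max()
    contact_area = np.count_nonzero(pmap > contact_threshold)
    # Peak Pressure Index: mean pressure in a small window around the peak cell
    r, c = np.unravel_index(np.argmax(pmap), pmap.shape)
    half = window // 2
    region = pmap[max(0, r - half):r + half + 1, max(0, c - half):c + half + 1]
    return {"peak": peak, "contact_cells": contact_area, "ppi": region.mean()}

def rank_cushions(maps):
    """Rank cushion candidates: lower peak-pressure index ranks better."""
    return sorted(maps, key=lambda name: cushion_metrics(maps[name])["ppi"])

# Two hypothetical 32x32 pressure maps for different cushions
rng = np.random.default_rng(0)
maps = {"cushion_A": rng.gamma(2.0, 10.0, (32, 32)),
        "cushion_B": rng.gamma(2.0, 7.0, (32, 32))}
print({name: cushion_metrics(m) for name, m in maps.items()})
print("ranking (best first):", rank_cushions(maps))
```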

Keywords: dynamic cushion, image processing, pressure mapping system, wheelchair

Procedia PDF Downloads 173
1358 Study of Effects of 3D Semi-Spherical Basin-Shape-Ratio on the Frequency Content and Spectral Amplitudes of the Basin-Generated Surface Waves

Authors: Kamal, J. P. Narayan

Abstract:

In the present work, the effects of basin-shape-ratio on the frequency content and spectral amplitudes of the basin-generated surface waves, and the associated spatial variation of ground motion amplification and differential ground motion, in a 3D semi-spherical basin have been studied. A recently developed 3D fourth-order spatially accurate time-domain finite-difference (FD) algorithm based on the parsimonious staggered-grid approximation of the 3D viscoelastic wave equations was used to estimate the seismic responses. The simulated results demonstrated an increase of both the frequency content and the spectral amplitudes of the basin-generated surface waves, and of the duration of ground motion in the basin, with increasing shape-ratio of the semi-spherical basin. An increase of the average spectral amplification (ASA), differential ground motion (DGM) and the average aggravation factor (AAF) towards the centre of the semi-spherical basin was obtained.

Keywords: 3D viscoelastic simulation, basin-generated surface waves, basin-shape-ratio effects, average spectral amplification, aggravation factors and differential ground motion

Procedia PDF Downloads 512
1357 Design of Speed Bump Recognition System Integrated with Adjustable Shock Absorber Control

Authors: Ming-Yen Chang, Sheng-Hung Ke

Abstract:

This research focuses on the development of a speed bump identification system for real-time control of adjustable shock absorbers in vehicular suspension systems. The study initially involved the collection of images of various speed bumps and rubber speed bump profiles found on roadways. These images were utilized for training and recognition purposes through the deep learning object detection algorithm YOLOv5. Subsequently, the trained speed bump identification program was integrated with an in-vehicle camera system for live image capture during driving. These images were instantly transmitted to a computer for processing. Using the principles of monocular vision ranging, the distance between the vehicle and an approaching speed bump was determined. The appropriate control distance was established through both practical vehicle measurements and theoretical calculations. In combination with the electronically adjustable shock absorbers equipped in the vehicle, a shock absorber control system was devised to dynamically adapt the damping force just prior to encountering a speed bump. This system effectively mitigates passenger discomfort and enhances ride quality.
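
A minimal sketch of the monocular vision ranging step via the pinhole model, distance = focal_length_px × real_height / pixel_height, combined with a simple trigger rule for the adjustable damper. The focal length, bump height, bounding-box size, delays and control distance are assumed example values, not the paper's calibration.

```python
def monocular_distance(focal_length_px, real_height_m, bbox_height_px):
    """Pinhole-model range estimate from the detected bounding-box height."""
    return focal_length_px * real_height_m / bbox_height_px

def should_soften_damping(distance_m, speed_mps, actuation_delay_s=0.15,
                          control_distance_m=5.0):
    """Trigger the adjustable shock absorber early enough to act before impact."""
    lead_distance = speed_mps * actuation_delay_s
    return distance_m <= control_distance_m + lead_distance

# Example: YOLOv5 detection box of an assumed 0.08 m high rubber speed bump
d = monocular_distance(focal_length_px=1000.0, real_height_m=0.08,
                       bbox_height_px=16.0)
print(f"estimated distance: {d:.2f} m")
print("adjust damping now:", should_soften_damping(d, speed_mps=8.3))
```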

Keywords: adjustable shock absorbers, image recognition, monocular vision ranging, ride quality

Procedia PDF Downloads 71
1356 Risk Mitigation of Data Causality Analysis Requirements AI Act

Authors: Raphaël Weuts, Mykyta Petik, Anton Vedder

Abstract:

Artificial Intelligence has the potential to create, and already creates, enormous value in healthcare. Prescriptive systems might be able to make the use of healthcare capacity more efficient. Such systems might, however, entail interpretations that exclude the effect of confounders, which brings risks with it. Those risks might be mitigated by regulation that prevents systems entailing such risks from coming to market. One modality of regulation is legislation, and the European AI Act is an example of such a regulatory instrument that might mitigate these risks. To assess the risk mitigation potential of the AI Act for those risks, this research focuses on a case study of a hypothetical application of medical device software that entails the aforementioned risks. The AI Act refers to the harmonised norms of already existing legislation, here the European medical device regulation. The issue at hand is a causal link between a confounder and the value the algorithm optimises for by proxy. The research identifies where the AI Act already looks at confounders (i.a. feedback loops in systems that continue to learn after being placed on the market). The research also identifies where the current proposal by parliament leaves legal uncertainty on the necessity to check for confounders that do not influence the input of the system, when the system does not continue to learn after being placed on the market. The authors propose an amendment to Article 15 of the AI Act that would require high-risk systems to be developed in such a way as to mitigate risks from those aforementioned confounders.

Keywords: AI Act, healthcare, confounders, risks

Procedia PDF Downloads 265
1355 Healthcare Big Data Analytics Using Hadoop

Authors: Chellammal Surianarayanan

Abstract:

The healthcare industry is generating large amounts of data driven by various needs such as record keeping, physicians' prescriptions, medical imaging, sensor data, Electronic Patient Records (EPR), laboratory, pharmacy, etc. Healthcare data is so big and complex that it cannot be managed by conventional hardware and software. The complexity of healthcare big data arises from the large volume of data, the velocity with which the data is accumulated, and the different varieties of data: structured, semi-structured and unstructured. Despite the complexity of big data, if the trends and patterns that exist within the big data are uncovered and analyzed, higher-quality healthcare can be provided at lower cost. Hadoop is an open-source software framework for distributed processing of large data sets across clusters of commodity hardware using a simple programming model. The core components of Hadoop include the Hadoop Distributed File System (HDFS), which offers a way to store large amounts of data across multiple machines, and MapReduce, which offers a way to process large data sets with a parallel, distributed algorithm on a cluster. The Hadoop ecosystem also includes various other tools such as Hive (a SQL-like query language), Pig (a higher-level query language for MapReduce), HBase (a columnar data store), etc. In this paper, an analysis has been done of how healthcare big data can be processed and analyzed using the Hadoop ecosystem.
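
A minimal Hadoop Streaming-style sketch of the MapReduce model described above, counting admissions per diagnosis code from CSV records. The record layout, file names and paths are assumed placeholders.

```python
#!/usr/bin/env python3
# Mapper/reducer pair for Hadoop Streaming; run e.g. (paths are placeholders):
#   hadoop jar hadoop-streaming.jar -mapper "count.py map" -reducer "count.py reduce" \
#       -input /ehr/records -output /ehr/diagnosis_counts
import sys

def mapper():
    for line in sys.stdin:
        fields = line.rstrip("\n").split(",")     # assumed: patient_id,diagnosis_code,...
        if len(fields) >= 2:
            print(f"{fields[1]}\t1")              # emit (diagnosis_code, 1)

def reducer():
    current, total = None, 0
    for line in sys.stdin:                        # streaming input is sorted by key
        key, value = line.rstrip("\n").split("\t")
        if key != current and current is not None:
            print(f"{current}\t{total}")
            total = 0
        current = key
        total += int(value)
    if current is not None:
        print(f"{current}\t{total}")

if __name__ == "__main__":
    reducer() if "reduce" in sys.argv else mapper()
```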

Keywords: big data analytics, Hadoop, healthcare data, towards quality healthcare

Procedia PDF Downloads 417
1354 Pricing European Options under Jump Diffusion Models with Fast L-stable Padé Scheme

Authors: Salah Alrabeei, Mohammad Yousuf

Abstract:

The goal of option pricing theory is to help investors manage their money, enhance returns and control their financial future by theoretically valuing their options. Modeling option pricing with Black-Scholes models with jumps guarantees that market movements are taken into account. However, only numerical methods can solve this model. Furthermore, not all numerical methods are efficient for solving these models, because the payoffs are non-smooth or have discontinuous derivatives at the exercise price. In this paper, the exponential time differencing (ETD) method is applied to solve the partial integro-differential equations arising in pricing European options under Merton's and Kou's jump-diffusion models. The Fast Fourier Transform (FFT) algorithm is used as a matrix-vector multiplication solver, which reduces the complexity from O(M²) to O(M log M). A partial-fraction form of Padé schemes is used to overcome the complexity of inverting matrix polynomials. These two tools guarantee efficient and accurate numerical solutions. We construct a parallel and easy-to-implement version of the numerical scheme. Numerical experiments are given to show how fast and accurate our scheme is.
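
For context, a short sketch of the semi-analytical benchmark commonly used to check such schemes: Merton's jump-diffusion European call, priced as a Poisson-weighted series of Black-Scholes prices with jump-adjusted drift and volatility. The parameter set is an assumed example, not the paper's test case.

```python
import math
from scipy.stats import norm

def bs_call(s, k, r, sigma, t):
    """Black-Scholes European call price."""
    d1 = (math.log(s / k) + (r + 0.5 * sigma**2) * t) / (sigma * math.sqrt(t))
    d2 = d1 - sigma * math.sqrt(t)
    return s * norm.cdf(d1) - k * math.exp(-r * t) * norm.cdf(d2)

def merton_call(s, k, r, sigma, t, lam, mu_j, delta, n_terms=60):
    """European call under Merton's jump-diffusion (lognormal jump sizes)."""
    kappa = math.exp(mu_j + 0.5 * delta**2) - 1.0      # mean relative jump size
    lam_p = lam * (1.0 + kappa)
    price = 0.0
    for n in range(n_terms):
        sigma_n = math.sqrt(sigma**2 + n * delta**2 / t)
        r_n = r - lam * kappa + n * (mu_j + 0.5 * delta**2) / t
        weight = math.exp(-lam_p * t) * (lam_p * t)**n / math.factorial(n)
        price += weight * bs_call(s, k, r_n, sigma_n, t)
    return price

# Example parameter set (assumed, for illustration only)
print(f"Merton call: {merton_call(100, 100, 0.05, 0.2, 1.0, 0.5, -0.1, 0.3):.4f}")
```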

Keywords: integro-differential equations, L-stable methods, pricing European options, jump-diffusion model

Procedia PDF Downloads 157
1353 A Framework for Early Differential Diagnosis of Tropical Confusable Diseases Using the Fuzzy Cognitive Map Engine

Authors: Faith-Michael E. Uzoka, Boluwaji A. Akinnuwesi, Taiwo Amoo, Flora Aladi, Stephen Fashoto, Moses Olaniyan, Joseph Osuji

Abstract:

The overarching aim of this study is to develop a soft-computing system for the differential diagnosis of tropical diseases. These conditions are of concern to health bodies, physicians, and the community at large because of their mortality rates and the difficulty of early diagnosis, due to the fact that they present with overlapping symptoms and thus become ‘confusable’. We report on the first phase of our study, which focuses on the development of a fuzzy cognitive map (FCM) model for early differential diagnosis of tropical diseases. We used malaria as a case disease to show the effectiveness of the FCM technology as an aid to the medical practitioner in the diagnosis of tropical diseases. Our model takes cognizance of manifested symptoms and other non-clinical factors that could contribute to symptom manifestation. Our model showed 85% accuracy in diagnosis, as against the physicians’ initial hypothesis, which stood at 55% accuracy. It is expected that the next stage of our study will provide a multi-disease, multi-symptom model that also improves efficiency by utilizing a decision support filter that works on an algorithm which mimics the physician’s diagnostic process.
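
A minimal fuzzy cognitive map inference sketch: concepts are iteratively updated by a sigmoid of their current activation plus the weighted influence of the other concepts. The concepts, weight matrix and patient vector are invented for illustration; the paper's map for malaria and other confusable diseases is far richer.

```python
import numpy as np

concepts = ["fever", "chills", "headache", "travel_to_endemic_area", "malaria"]
W = np.array([  # W[i, j]: causal influence of concept i on concept j (assumed weights)
    [0.0, 0.0, 0.0, 0.0, 0.6],
    [0.0, 0.0, 0.0, 0.0, 0.5],
    [0.0, 0.0, 0.0, 0.0, 0.3],
    [0.0, 0.0, 0.0, 0.0, 0.7],
    [0.0, 0.0, 0.0, 0.0, 0.0],
])

def sigmoid(x, lam=2.0):
    return 1.0 / (1.0 + np.exp(-lam * x))

def fcm_infer(a0, W, iterations=20):
    a = np.asarray(a0, dtype=float)
    for _ in range(iterations):                 # a_{t+1} = f(a_t + a_t @ W)
        a = sigmoid(a + a @ W)
    return a

# Patient presenting with fever, headache and recent travel to an endemic area
state = fcm_infer([1.0, 0.0, 1.0, 1.0, 0.0], W)
print(dict(zip(concepts, np.round(state, 3))))
print("malaria activation:", round(state[-1], 3))
```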

Keywords: medical diagnosis, tropical diseases, fuzzy cognitive map, decision support filters, malaria differential diagnosis

Procedia PDF Downloads 326
1352 Hybrid Seismic Energy Dissipation Devices Made of Viscoelastic Pad and Steel Plate

Authors: Jinkoo Kim, Minsung Kim

Abstract:

This study develops a hybrid seismic energy dissipation device composed of a viscoelastic damper and a steel slit damper connected in parallel. A cyclic loading test is conducted on a test specimen to validate the seismic performance of the hybrid damper. Then a moment-frame model structure is designed without seismic load and retrofitted with the hybrid dampers. The model structure is transformed into an equivalent simplified system to find the optimum story-wise damper distribution pattern using a genetic algorithm. The effectiveness of the hybrid damper is investigated by fragility analysis and the life cycle cost evaluation of the structure with and without the dampers. The analysis results show that the model structure has a reduced probability of reaching the damage states, especially the complete damage state, after seismic retrofit. The expected damage cost, and consequently the life cycle cost, of the retrofitted structure turns out to be significantly smaller than that of the original structure. Acknowledgement: This research was supported by the Ministry of Trade, Industry and Energy (MOTIE) and Korea Institute for Advancement of Technology (KIAT) through the International Cooperative R & D program (N043100016).

Keywords: seismic retrofit, slit dampers, friction dampers, hybrid dampers

Procedia PDF Downloads 286
1351 Bidirectional Long Short-Term Memory-Based Signal Detection for Orthogonal Frequency Division Multiplexing With All Index Modulation

Authors: Mahmut Yildirim

Abstract:

This paper proposes bidirectional long short-term memory (Bi-LSTM) network-aided deep learning (DL)-based signal detection for orthogonal frequency division multiplexing with all index modulation (OFDM-AIM), namely Bi-DeepAIM. OFDM-AIM is developed to increase the spectral efficiency of OFDM with index modulation (OFDM-IM), a promising multi-carrier technique for communication systems beyond 5G. In this paper, due to its strong classification ability, Bi-LSTM is considered an alternative to the maximum likelihood (ML) algorithm, which is used for signal detection in the classical OFDM-AIM scheme. The performance of Bi-DeepAIM is compared with LSTM network-aided DL-based OFDM-AIM (DeepAIM) and the classic OFDM-AIM that uses ML-based signal detection, via bit error rate (BER) performance and computational time criteria. Simulation results show that Bi-DeepAIM obtains better BER performance than DeepAIM and lower computation time in signal detection than ML-based OFDM-AIM.
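
A hedged sketch of a Bi-LSTM detector treating subblock detection as classification of the received real/imaginary samples. The subblock size, number of candidate classes and synthetic training data are placeholder assumptions, not the OFDM-AIM lookup table or training setup of the paper.

```python
import numpy as np
import tensorflow as tf

N_SUBCARRIERS, N_CLASSES = 4, 16

model = tf.keras.Sequential([
    tf.keras.Input(shape=(N_SUBCARRIERS, 2)),                # (Re, Im) per carrier
    tf.keras.layers.Bidirectional(tf.keras.layers.LSTM(64)), # Bi-LSTM feature extractor
    tf.keras.layers.Dense(64, activation="relu"),
    tf.keras.layers.Dense(N_CLASSES, activation="softmax"),  # subblock decision
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

# Synthetic data standing in for noisy received subblocks and their labels
x = np.random.randn(2048, N_SUBCARRIERS, 2).astype("float32")
y = np.random.randint(0, N_CLASSES, size=2048)
model.fit(x, y, epochs=2, batch_size=64, verbose=0)

print("detected subblock index:", model.predict(x[:1], verbose=0).argmax())
```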

Keywords: bidirectional long short-term memory, deep learning, maximum likelihood, OFDM with all index modulation, signal detection

Procedia PDF Downloads 77
1350 State Estimation of a Biotechnological Process Using Extended Kalman Filter and Particle Filter

Authors: R. Simutis, V. Galvanauskas, D. Levisauskas, J. Repsyte, V. Grincas

Abstract:

This paper deals with advanced state estimation algorithms for the estimation of biomass concentration and specific growth rate in a typical fed-batch biotechnological process. The biotechnological process was represented by a nonlinear mass-balance-based process model. An Extended Kalman Filter (EKF) and a Particle Filter (PF) were used to estimate the unmeasured state variables from oxygen uptake rate (OUR) and base consumption (BC) measurements. To obtain more general results, a simplified process model was used in the EKF and PF estimation algorithms. This model doesn’t require any special growth kinetic equations and could be applied for state estimation in various bioprocesses. The focus of this investigation was on comparing the estimation quality of the EKF and PF estimators under different measurement noises. The simulation results show that the Particle Filter algorithm requires significantly more computation time for state estimation but gives lower estimation errors for both biomass concentration and specific growth rate. Also, the tuning procedure for the Particle Filter is simpler than for the EKF. Consequently, the Particle Filter should be preferred in real applications, especially for monitoring industrial bioprocesses, where simple implementation procedures are always desirable.
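
A minimal bootstrap particle filter sketch for a fed-batch setting: biomass and specific growth rate are propagated through a simple mass-balance model and weighted by an OUR-like measurement. The yield coefficient, noise levels and true trajectory are illustrative assumptions, not the paper's process model.

```python
import numpy as np

rng = np.random.default_rng(1)
N, dt, steps = 500, 0.1, 200
Y_OX = 1.2                                    # assumed OUR yield coefficient

x_true, mu_true = 1.0, 0.20                   # true biomass [g/L], growth rate [1/h]
particles = np.column_stack([rng.normal(1.0, 0.2, N),      # biomass particles
                             rng.normal(0.15, 0.05, N)])   # growth-rate particles

for _ in range(steps):
    # simulate the plant and a noisy OUR measurement
    x_true += dt * mu_true * x_true
    mu_true += rng.normal(0, 0.002)
    our_meas = Y_OX * mu_true * x_true + rng.normal(0, 0.05)

    # propagate particles through the mass-balance model with process noise
    particles[:, 0] += dt * particles[:, 1] * particles[:, 0] + rng.normal(0, 0.01, N)
    particles[:, 1] += rng.normal(0, 0.005, N)

    # weight by measurement likelihood and resample (bootstrap PF)
    pred = Y_OX * particles[:, 1] * particles[:, 0]
    weights = np.exp(-0.5 * ((our_meas - pred) / 0.05) ** 2) + 1e-300
    weights /= weights.sum()
    particles = particles[rng.choice(N, N, p=weights)]

print(f"true X={x_true:.2f}, estimated X={particles[:, 0].mean():.2f}")
print(f"true mu={mu_true:.3f}, estimated mu={particles[:, 1].mean():.3f}")
```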

Keywords: biomass concentration, extended Kalman filter, particle filter, state estimation, specific growth rate

Procedia PDF Downloads 433
1349 Development of Trigger Tool to Identify Adverse Drug Events From Warfarin Administered to Patients Admitted in Medical Wards of Chumphae Hospital

Authors: Puntarikorn Rungrattanakasin

Abstract:

Objectives: To develop a trigger tool to warn about the risk of bleeding as an adverse event from warfarin usage during admission in the Medical Wards of Chumphae Hospital. Methods: A retrospective study was performed by reviewing the medical records of patients admitted between June 1st, 2020 and May 31st, 2021. Adverse drug events (ADEs) were evaluated by Naranjo’s algorithm. The international normalized ratio (INR) and bleeding events during admissions were collected. Statistical analyses, including the Chi-square test and the Receiver Operating Characteristic (ROC) curve for the optimal INR threshold, were used for the study. Results: Among the 139 admissions, the INR was found to vary between 0.86 and 14.91; there was a total of 15 bleeding events, of which 9 were mild and 6 were severe. Bleeding occurred whenever the INR was greater than 2.5, and the association reached statistical significance (p < 0.05), which was in concordance with the ROC curve and yielded 100% sensitivity and 60% specificity in the detection of a bleeding event. In this regard, an INR greater than 2.5 was considered the optimal threshold for a prompt alert of bleeding tendency. Conclusions: An INR value greater than 2.5 would be an appropriate trigger tool to warn of the risk of bleeding for patients taking warfarin in Chumphae Hospital.
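
A hedged sketch of how an ROC-based INR alert threshold can be selected, picking the smallest cut-off that preserves 100% sensitivity and reporting its specificity. The synthetic records below are placeholders, not the Chumphae Hospital data.

```python
import numpy as np
from sklearn.metrics import roc_curve, roc_auc_score

rng = np.random.default_rng(7)
inr = np.concatenate([rng.normal(2.0, 0.6, 124),      # admissions without bleeding
                      rng.normal(3.8, 1.2, 15)])      # bleeding events
bled = np.concatenate([np.zeros(124), np.ones(15)])

fpr, tpr, thresholds = roc_curve(bled, inr)
print("AUC:", round(roc_auc_score(bled, inr), 3))

# Highest threshold that still keeps sensitivity at 100%, mirroring the
# trigger-tool goal of never missing a bleeding event
best = np.argmax(tpr >= 1.0)                          # first index with TPR = 1
print(f"alert threshold: INR > {thresholds[best]:.2f}, "
      f"sensitivity = {tpr[best]:.0%}, specificity = {1 - fpr[best]:.0%}")
```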

Keywords: trigger tool, warfarin, risk of bleeding, medical wards

Procedia PDF Downloads 150
1348 Computer Aided Analysis of Breast Based Diagnostic Problems from Mammograms Using Image Processing and Deep Learning Methods

Authors: Ali Berkan Ural

Abstract:

This paper presents the analysis, evaluation, and pre-diagnosis of early-stage breast diagnostic problems (breast cancer, nodules, or lumps) by a Computer Aided Diagnosis (CAD) system from mammogram radiological images. According to the statistics, the time factor is crucial to discovering the disease in the patient (especially in women) as early and as fast as possible. In the study, a new algorithm is developed using advanced image processing and deep learning methods to detect and classify the problem at an early stage with higher accuracy. The system first applies image processing methods (image acquisition, noise removal, region growing segmentation, morphological operations, breast border extraction, advanced segmentation, obtaining regions of interest (ROIs), etc.) to segment the area of interest of the breast, and then analyzes these partial areas for cancer/lump detection in order to diagnose the disease. After segmentation, using the spectrogram images, five different deep-learning-based methods (the Convolutional Neural Network (CNN) based AlexNet, ResNet50, VGG16, DenseNet, and Xception) are applied to classify the breast-based problems.
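
A minimal region-growing segmentation sketch for the segmentation step named above: starting from a seed pixel, 4-connected neighbours are added while their intensity stays within a tolerance of the seed value. The seed, tolerance and synthetic patch are assumed inputs; the paper's pipeline additionally applies noise removal, morphology and border extraction.

```python
from collections import deque
import numpy as np

def region_growing(image, seed, tolerance=20):
    image = np.asarray(image, dtype=float)
    h, w = image.shape
    mask = np.zeros((h, w), dtype=bool)
    seed_value = image[seed]
    queue = deque([seed])
    mask[seed] = True
    while queue:
        r, c = queue.popleft()
        for nr, nc in ((r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)):
            if (0 <= nr < h and 0 <= nc < w and not mask[nr, nc]
                    and abs(image[nr, nc] - seed_value) <= tolerance):
                mask[nr, nc] = True
                queue.append((nr, nc))
    return mask

# Synthetic mammogram-like patch: a bright blob on a darker background
img = np.full((64, 64), 60.0)
yy, xx = np.mgrid[:64, :64]
img[(yy - 32) ** 2 + (xx - 32) ** 2 < 100] = 180.0
roi = region_growing(img, seed=(32, 32), tolerance=30)
print("segmented ROI pixels:", int(roi.sum()))
```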

Keywords: computer aided diagnosis, breast cancer, region growing, segmentation, deep learning

Procedia PDF Downloads 100
1347 Effects of Non-Motorized Vehicles on a Selected Intersection in Dhaka City for Non Lane Based Heterogeneous Traffic Using VISSIM 5.3

Authors: A. C. Dey, H. M. Ahsan

Abstract:

Heterogeneous traffic composed of both motorized and non-motorized vehicles is a common feature of urban Bangladeshi roads. Popular non-motorized vehicles include rickshaws, rickshaw-vans, and bicycles. These modes play an important role in moving people and goods in the absence of a dependable mass transport system. Rickshaws, in particular, play a major role in meeting the demand for door-to-door public transport services for city dwellers. However, there is no separate lane for non-motorized vehicles in this city. Non-motorized vehicles generally occupy the outermost or curb-side lanes; however, at intersections non-motorized vehicles get mixed with motorized vehicles. That is why conventional models fail to analyze the situation completely. The microscopic traffic simulation software VISSIM 5.3 is itself lane-based, but its default behavioral parameters [such as driving behavior, lateral distances, overtaking tendency, CC0 = 0.4 m, CC1 = 1.5 s] are modified to calibrate a model for analyzing the effects of non-motorized traffic at an intersection (Mirpur-10) under non-lane-based mixed traffic conditions. Field data show that NMVs account for an average of 20% of the total number of vehicles on almost all the link roads. Due to the large share of non-motorized vehicles, capacity drops significantly. Analysis of the raw simulation data shows significant variation: the average vehicular speed is reduced by 25% and the number of vehicles decreases by 30% solely due to the presence of NMVs. Also, lateral occupancy and queue delay time increase by 2.37% and 33.75%, respectively. These results clearly show the negative effects of non-motorized vehicles on capacity at an intersection. Special management techniques or restriction of NMVs at major intersections may therefore be an effective solution to improve this existing critical condition.

Keywords: lateral occupancy, non-lane-based intersection, NMV, queue delay time, VISSIM 5.3

Procedia PDF Downloads 157
1346 Application of Deep Neural Networks to Assess Corporate Credit Rating

Authors: Parisa Golbayani, Dan Wang, Ionuţ Florescu

Abstract:

In this work, we apply machine learning techniques to financial statement reports in order to assess a company's credit rating. Specifically, the work analyzes the performance of four neural network architectures (MLP, CNN, CNN2D, LSTM) in predicting corporate credit ratings as issued by Standard and Poor's. The paper focuses on companies from the energy, financial, and healthcare sectors in the US. The goal of this analysis is to improve the application of machine learning algorithms to credit assessment. To accomplish this, the study investigates three questions. First, we investigate whether the algorithms perform better when using a selected subset of important features or whether better performance is obtained by allowing the algorithms to select features themselves. Second, we address the temporal aspect inherent in financial data and study whether it is important for the results obtained by a machine learning algorithm. Third, we aim to answer whether one of the four particular neural network architectures considered consistently outperforms the others, and if so, under which conditions. This work frames the problem as several case studies to answer these questions and analyzes the results using ANOVA and multiple comparison testing procedures.

Keywords: convolutional neural network, long short term memory, multilayer perceptron, credit rating

Procedia PDF Downloads 240
1345 Gymnastics-Oriented Training Program: Impact of 6 weeks Training on the Fitness and Performance of Basketball Players

Authors: Syed Ibrahim, Syed Muneer Ahmed

Abstract:

It is a global phenomenon that fitness, achieved through an appropriate conditioning program, is a prerequisite for optimum efficiency in elite-class basketball players. This study was undertaken to find out the effect of a gymnastics-oriented training program on the physical fitness and the level of technical performance of basketball players. Method: 27 basketball players aged between 19 and 25 years were divided into an experimental group (n = 12) and a control group (n = 15). Physical fitness tests comprising vertical jump, push-ups, chin-ups, sit-ups, back strength, 30 m sprint, boomerang test, 600 m run, sit-and-reach, bridge-up and shoulder rotation, and technical skill tests such as dribbling, lay-up shots and rebound collection were used for the study. A pre-test and a post-test were conducted before and after the 6-week training program. Results: The results indicated no significant difference in the anthropometric measurements of age, height and weight between the experimental and control groups, as the ‘t’ values observed were 0.28, 1.63 and 1.60, respectively. There were significant improvements in vertical jump, push-ups, sit-ups, modified boomerang test, bridge test and shoulder rotation index, with the ‘t’ values being 2.60, 3.41, 3.91, 4.02, 3.55 and 2.33, respectively. However, no significant differences existed in chin-ups, back strength, 30 m sprint and 600 m run, with the ‘t’ values being 2.08, 1.77, 1.28 and 0.80, respectively. There was significant improvement in the post-test for the technical skill tests in the experimental group, with ‘t’ values of 3.65, 2.57, and 3.62 for the dribble, lay-up shots and rebound collection, respectively. There was no significant difference in the values of the control group, except in rebound collection, which showed a significant difference. Conclusion: It was found that both the physical fitness and the skill proficiency of the basketball players increased through participation in the gymnastics-oriented program.

Keywords: gymnastic, technical, pre-requisite, elite class

Procedia PDF Downloads 404
1344 Study on Optimization Design of Pressure Hull for Underwater Vehicle

Authors: Qasim Idrees, Gao Liangtian, Liu Bo, Miao Yiran

Abstract:

In order to improve the efficiency and accuracy of pressure hull structure optimization for an underwater vehicle based on response surface methodology, a method for optimizing the design of the pressure hull structure was studied. Five dimensions of the pressure shell were taken as design variables, and thin shell theory and the Chinese Classification Society (CCS) specification were applied in the preliminary design. To optimize the variables within the feasible region, different methods were studied and implemented: the optimal Latin hypercube design (Opt LHD) method to determine the design test sample points in the feasible domain space, a parametric ABAQUS solution for the response at each sample point, and a second-order polynomial response surface model of the limit load of the structure. Based on the ultimate load of the structure and the mass of the shell, a second-generation genetic algorithm was used to solve the response surface problem, and the Pareto optimal solution set was obtained. The final optimized result was 41.68% higher than that of the initial design, and the shell mass was reduced by about 27.26%. The parametric method can ensure the accuracy of the test and improve the efficiency of optimization.
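
A hedged sketch of the surrogate-modelling step: Latin hypercube sampling of the five design variables, an assumed analytic stand-in for the ABAQUS limit-load response, and a second-order polynomial response surface fit. The variable bounds and the response function are placeholders, not the hull model.

```python
import numpy as np
from scipy.stats import qmc
from sklearn.linear_model import LinearRegression
from sklearn.preprocessing import PolynomialFeatures

rng = np.random.default_rng(3)
bounds_low = np.array([6.0, 10.0, 0.02, 0.3, 0.01])   # assumed variable bounds
bounds_high = np.array([8.0, 14.0, 0.05, 0.6, 0.03])

sampler = qmc.LatinHypercube(d=5, seed=3)
X = qmc.scale(sampler.random(n=60), bounds_low, bounds_high)

def limit_load(x):          # stand-in for the parametric FE solution
    r, L, t, s, tr = x.T
    return 5e3 * t**1.8 / r + 2e2 * tr * s - 0.5 * L + rng.normal(0, 0.1, len(x))

y = limit_load(X)

# Second-order polynomial response surface (includes interaction terms)
poly = PolynomialFeatures(degree=2, include_bias=True)
surrogate = LinearRegression().fit(poly.fit_transform(X), y)
print("R^2 on samples:", round(surrogate.score(poly.transform(X), y), 3))

x_new = np.array([[7.0, 12.0, 0.04, 0.5, 0.02]])
print("predicted limit load:", surrogate.predict(poly.transform(x_new))[0])
```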

Keywords: parameterization, response surface, structure optimization, pressure hull

Procedia PDF Downloads 237
1343 Inversion of Electrical Resistivity Data: A Review

Authors: Shrey Sharma, Gunjan Kumar Verma

Abstract:

High-density electrical prospecting has been widely used in groundwater investigation, civil engineering and environmental surveys. For efficient inversion, the forward modeling routine, the sensitivity calculation, and the inversion algorithm must all be efficient. This paper attempts to provide a brief summary of the past and ongoing developments of the method. It includes reviews of the procedures used for data acquisition, processing and inversion of electrical resistivity data, based on a compilation of the academic literature. In recent times there has been a significant evolution in field survey designs and data inversion techniques for the resistivity method. In general, 2-D inversion of resistivity data is carried out using the linearized least-squares method with a local optimization technique. Multi-electrode and multi-channel systems have made it possible to conduct large 2-D, 3-D and even 4-D surveys efficiently to resolve complex geological structures that were not possible with traditional 1-D surveys. 3-D surveys play an increasingly important role in very complex areas where 2-D models suffer from artifacts due to off-line structures. Continued developments in computation technology, as well as fast data inversion techniques and software, have made it possible to use optimization techniques to obtain model parameters to a higher accuracy. A brief discussion of the limitations of the electrical resistivity method has also been presented.
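
A generic sketch of one linearized least-squares (Gauss-Newton) model update with smoothness regularization, the core linear algebra behind such 2-D resistivity inversions. In practice the Jacobian, data and forward response come from a finite-difference/finite-element forward solver; here they are random placeholders just to show the update step.

```python
import numpy as np

def gauss_newton_step(J, d_obs, d_pred, m, lam=0.1):
    """Solve (J^T J + lam R^T R) dm = J^T (d_obs - d_pred) - lam R^T R m."""
    n = m.size
    R = np.eye(n, k=0) - np.eye(n, k=1)          # first-difference roughness operator
    A = J.T @ J + lam * R.T @ R
    b = J.T @ (d_obs - d_pred) - lam * (R.T @ R) @ m
    return np.linalg.solve(A, b)

rng = np.random.default_rng(0)
n_data, n_model = 120, 40
J = rng.normal(size=(n_data, n_model))           # placeholder sensitivity matrix
m = np.zeros(n_model)                            # log-resistivity model
d_obs = rng.normal(size=n_data)                  # placeholder apparent-resistivity data
d_pred = J @ m                                   # placeholder forward response

dm = gauss_newton_step(J, d_obs, d_pred, m, lam=0.5)
print("model update norm:", round(np.linalg.norm(dm), 3))
```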

Keywords: inversion, limitations, optimization, resistivity

Procedia PDF Downloads 368
1342 An Advanced Approach to Detect and Enumerate Soil-Transmitted Helminth Ova from Wastewater

Authors: Vivek B. Ravindran, Aravind Surapaneni, Rebecca Traub, Sarvesh K. Soni, Andrew S. Ball

Abstract:

Parasitic diseases have a devastating, long-term impact on human health and welfare. More than two billion people are infected with soil-transmitted helminths (STHs), including the roundworms (Ascaris), hookworms (Necator and Ancylostoma) and whipworm (Trichuris), with the majority occurring in the tropical and subtropical regions of the world. Despite their low prevalence in developed countries, the removal of STHs from wastewater remains crucial to allow the safe use of sludge or recycled water in agriculture. Conventional methods such as incubation and optical microscopy are cumbersome; consequently, the results vary drastically from person to person when observing the ova (eggs) under the microscope. Although PCR-based methods are an alternative to conventional techniques, they lack the ability to distinguish between viable and non-viable helminth ova. As a result, wastewater treatment industries are in major need of radically new and innovative tools to detect and quantify STH eggs with precision, accuracy and cost-effectiveness. In our study, we focus on the following novel and innovative techniques: recombinase polymerase amplification and surface-enhanced Raman spectroscopy (RPA-SERS) based detection of helminth ova; use of metal nanoparticles and their relative nanozyme activity; colorimetric detection, differentiation and enumeration of helminth ova genera using hydrolytic enzymes (chitinase and lipase); propidium monoazide (PMA)-qPCR to detect viable helminth ova; a modified assay to recover and enumerate helminth eggs from fresh raw sewage; and transcriptome analysis of Ascaris ova in fresh raw sewage. The aforementioned techniques have the potential to replace current conventional and molecular methods, thereby producing a standard protocol for the determination and enumeration of helminth ova in sewage sludge.

Keywords: colorimetry, helminth, PMA-QPCR, nanoparticles, RPA, viable

Procedia PDF Downloads 300
1341 Proliferative Effect of Some Calcium Channel Blockers on the Human Embryonic Kidney Cell Line

Authors: Lukman Ahmad Jamil, Heather M. Wallace

Abstract:

Introduction: Numerous epidemiological studies have shown positive associations, negative associations, and in some cases no association between the chronic use of calcium channel blockers and the increased risk of developing cancer. However, these associations have remained controversial in the absence of laboratory-based studies to back up those claims. Aim: The aim of this study was to determine in mechanistic terms the association between the long-term administration of nifedipine and diltiazem and the increased risk of developing cancer, using the human embryonic kidney (HEK293) cell line. Methods: Cell counting using the Trypan blue dye exclusion assay and the 3-(4,5-dimethylthiazol-2-yl)-2,5-diphenyltetrazolium bromide (MTT) assay was used to investigate the effect of nifedipine and diltiazem on the growth pattern of HEK293 cells. A protein assay using the modified Lowry method and analysis of intracellular polyamine concentrations using liquid chromatography-tandem mass spectrometry (LC-MS) were performed to ascertain the mechanism through which chronic use of nifedipine increases the risk of developing cancer. Results: Both nifedipine and diltiazem significantly increased the proliferation of HEK293 cells in a dose- and time-dependent manner. This proliferative effect after 24-, 48- and 72-hour incubation periods was observed at 0.78, 1.56 and 25 µM for nifedipine and at 0.39, 1.56 and 25 µM for diltiazem, respectively. The increased proliferation of the cells was found to be statistically significant (p < 0.05). Furthermore, the increased proliferation of the cells induced by nifedipine was associated with an increase in protein content and elevated intracellular polyamine concentrations. Conclusion: The chronic use of nifedipine is associated with increased cell proliferation and a concomitant elevation of polyamine concentrations; elevated polyamine levels have been implicated in many malignant transformations, and hence these findings provide a possible explanation for the link between long-term use of nifedipine and the development of some human cancers. Further studies are needed to evaluate the cause of this association.

Keywords: cancer, nifedipine, polyamine, proliferation

Procedia PDF Downloads 201