Search results for: coupling model
8656 Computational System for the Monitoring Ecosystem of the Endangered White Fish (Chirostoma estor estor) in the Patzcuaro Lake, Mexico
Authors: Cesar Augusto Hoil Rosas, José Luis Vázquez Burgos, José Juan Carbajal Hernandez
Abstract:
White fish (Chirostoma estor estor) is an endemic species that inhabits Lake Patzcuaro, located in Michoacan, Mexico, and is an important source of gastronomic and cultural wealth for the area. In recent years, it has undergone an immense depopulation due to overfishing, contamination, and eutrophication of the lake water, putting this important species at risk of extinction. This work proposes a new computational model for monitoring and assessing the critical environmental parameters of the white fish ecosystem. Using an Analytical Hierarchy Process, a mathematical model is built that assigns a weight to each environmental parameter according to its importance for water quality in the ecosystem. An advanced system for the monitoring, analysis, and control of water quality is then built in the LabVIEW virtual environment. As a result, we obtain a global score that indicates the condition of the water quality in the Chirostoma estor ecosystem (excellent, good, regular, or poor), supporting effective decision-making about the environmental parameters that affect the proper culture of the white fish, such as temperature, pH, and dissolved oxygen. In situ evaluations show regular conditions for successful reproduction and growth of this species where the water quality tends toward regular levels. This system emerges as a suitable tool for water management: future laws regulating the white fish fishery should reduce the mortality rate in the early stages of development of the species, which represent the most critical phase. This can guarantee better population sizes than those currently obtained in aquaculture.
The main benefit will be a contribution to maintaining the cultural and gastronomic wealth of the area and of its inhabitants, since white fish is an important source of food and income for the region, yet the species is endangered.Keywords: Chirostoma estor estor, computational system, LabVIEW, white fish
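The weighted-score idea in this abstract can be sketched with a toy AHP calculation. The pairwise comparison values and the three parameters below (dissolved oxygen, temperature, pH) are illustrative assumptions, not the paper's actual judgment matrix; the weights come from the common geometric-mean approximation of the principal eigenvector.

```python
import numpy as np

# Hypothetical AHP pairwise comparison matrix for three water-quality
# parameters (dissolved oxygen, temperature, pH); values are illustrative.
A = np.array([
    [1.0, 3.0, 5.0],    # dissolved oxygen vs. (DO, temperature, pH)
    [1/3, 1.0, 3.0],    # temperature
    [1/5, 1/3, 1.0],    # pH
])

# Priority weights via the geometric-mean approximation of the
# principal eigenvector, normalized to sum to 1.
gm = A.prod(axis=1) ** (1.0 / A.shape[1])
weights = gm / gm.sum()

def water_quality_score(normalized_readings):
    """Weighted global score in [0, 1]; inputs are per-parameter scores."""
    return float(np.dot(weights, normalized_readings))

# Example: DO scores 0.8, temperature 0.6, pH 0.9 on a 0-1 scale.
score = water_quality_score([0.8, 0.6, 0.9])
```

The global score would then be binned into the paper's four condition levels (excellent, good, regular, poor) at thresholds of the implementers' choosing.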
Procedia PDF Downloads 325
8655 Improved Computational Efficiency of Machine Learning Algorithm Based on Evaluation Metrics to Control the Spread of Coronavirus in the UK
Authors: Swathi Ganesan, Nalinda Somasiri, Rebecca Jeyavadhanam, Gayathri Karthick
Abstract:
The COVID-19 crisis presents a substantial and critical hazard to worldwide health. Since the occurrence of the disease in the UK in late January 2020, the number of people confirmed to have acquired the illness has increased tremendously across the country. The purpose of this research is to build a predictive machine learning model that can forecast COVID-19 cases within the UK. This study concentrates on the statistical data collected from 31st January 2020 to 31st March 2021 in the United Kingdom. Information on total COVID cases registered, new cases encountered on a daily basis, total deaths registered, and deaths per day due to Coronavirus was collected from the World Health Organisation (WHO). Data preprocessing was carried out to identify any missing values, outliers, or anomalies in the dataset. The data were split in an 8:2 ratio for training and testing purposes to forecast future new COVID cases. Support Vector Machines (SVM), Random Forests, and linear regression algorithms were chosen to study model performance in the prediction of new COVID-19 cases. The statistical performance of each model in predicting new COVID cases was evaluated using metrics such as the r-squared value and mean squared error. Random Forest outperformed the other two machine learning algorithms, with a training accuracy of 99.47% and a testing accuracy of 98.26% when n=30. The mean squared error obtained for Random Forest was 4.05e11, lower than that of the other predictive models used in this study. The experimental analysis shows that the Random Forest algorithm can predict new COVID cases effectively and efficiently, which could help the health sector take relevant control measures against the spread of the virus.Keywords: COVID-19, machine learning, supervised learning, unsupervised learning, linear regression, support vector machine, random forest
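The pipeline described (chronological 8:2 split, Random Forest with n=30 trees, R² and MSE metrics) can be sketched on synthetic data; the sinusoidal series below merely stands in for the WHO daily case counts, and the seven lag features are an assumed feature construction, not the paper's.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import mean_squared_error, r2_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
days = np.arange(400)
# Synthetic stand-in for the WHO daily new-case series.
cases = 1000 + 800 * np.sin(days / 30.0) + rng.normal(0.0, 50.0, size=days.size)

# Seven lagged values predict the next day's count.
lags = 7
X = np.column_stack([cases[i:cases.size - lags + i] for i in range(lags)])
y = cases[lags:]

# 8:2 chronological split (shuffle=False keeps time order, as a
# time-series forecast requires).
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, shuffle=False)

model = RandomForestRegressor(n_estimators=30, random_state=0)
model.fit(X_train, y_train)
pred = model.predict(X_test)
r2 = r2_score(y_test, pred)
mse = mean_squared_error(y_test, pred)
```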
Procedia PDF Downloads 121
8654 Searching the Efficient Frontier for the Coherent Covering Location Problem
Authors: Felipe Azocar Simonet, Luis Acosta Espejo
Abstract:
In this article, we seek an efficient-frontier approximation for the bi-objective location problem with coherent coverage for two levels of hierarchy (CCLP). We present the mathematical formulation of the model used. Supported and unsupported efficient solutions are obtained by solving the bi-objective combinatorial problem through the weights method using a Lagrangian heuristic. Subsequently, the results are validated through DEA analysis with the GEM index (global efficiency measurement).Keywords: coherent covering location problem, efficient frontier, Lagrangian relaxation, data envelopment analysis
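A minimal illustration of the weights method mentioned here: sweeping a scalarization weight over [0, 1] recovers the supported efficient solutions of a bi-objective minimization problem, while unsupported ones (lying above the convex hull of the front) are never returned and need other techniques. The candidate objective vectors are toy values, not CCLP solutions.

```python
# Toy objective vectors (f1, f2) for a bi-objective minimization problem;
# illustrative values only.
candidates = [(2, 9), (3, 7), (5, 4), (8, 2), (6, 6), (9, 1)]

def supported_efficient(points, steps=101):
    """Weights method: minimize lam*f1 + (1-lam)*f2 over a sweep of lam.
    Each minimizer is a supported efficient solution; the unsupported
    efficient point (8, 2) and the dominated point (6, 6) never win."""
    found = set()
    for k in range(steps):
        lam = k / (steps - 1)
        best = min(points, key=lambda p: lam * p[0] + (1 - lam) * p[1])
        found.add(best)
    return sorted(found)

supported = supported_efficient(candidates)
```

In the paper's setting each weighted single-objective problem is itself hard, which is where the Lagrangian heuristic comes in; this sketch only shows the scalarization logic.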
Procedia PDF Downloads 333
8653 Exoskeleton Response During Infant Physiological Knee Kinematics And Dynamics
Authors: Breanna Macumber, Victor A. Huayamave, Emir A. Vela, Wangdo Kim, Tamara T. Chamber, Esteban Centeno
Abstract:
Spina bifida is a type of neural tube defect that affects the nervous system and can lead to problems such as total leg paralysis. Treatment requires physical therapy and rehabilitation. Robotic exoskeletons have been used in rehabilitation to train muscle movement and assist in injury recovery; however, current models focus on adult populations rather than infants. The proposed framework aims to couple a musculoskeletal infant model with a robotic exoskeleton using vacuum-powered artificial muscles to provide rehabilitation to infants affected by spina bifida. The study that provided the input values for the robotic exoskeleton used motion capture technology to collect data from the spontaneous kicking movement of a 2.4-month-old infant lying supine. OpenSim was used to develop the musculoskeletal model, and inverse kinematics was used to estimate hip joint angles. A total of 4 kicks (A, B, C, D) were selected based on range, transient response, and stable response: each kick had at least 5° of range of motion with a smooth transient response and a stable period. The robotic exoskeleton used a Vacuum-Powered Artificial Muscle (VPAM) whose structure comprised cells that were clipped in a collapsed state and unclipped when desired to match the infant's age. The artificial muscle works with vacuum pressure: when air is removed, the muscle contracts, and when air is added, the muscle relaxes. Bench testing was performed using a 6-month-old infant mannequin. The previously developed exoskeleton performed well with controlled ranges of motion and frequencies, which are typical of rehabilitation protocols for infants suffering from spina bifida. However, the random kicking motion in this study contained high-frequency kicks, and the exoskeleton was not able to accurately replicate all the investigated kicks; kick 'A' had a greater error than the other kicks.
This study has the potential to advance the infant rehabilitation field.Keywords: musculoskeletal modeling, soft robotics, rehabilitation, pediatrics
Procedia PDF Downloads 83
8652 Simulating the Interaction of Strategy Development and Project Delivery
Authors: Nipun Agarwal, David Paul, Fareed Un Din
Abstract:
Every organization develops a strategy that must then be implemented through project delivery. In essence, project requirements should exactly replicate an organization's strategy. In reality this does not happen: behavioral factors deviate the project delivery from the strategic objectives, since project stakeholders can have competing objectives. As a result, the requirements implemented through projects are less aligned with the strategy. This paper develops a game-theoretic model to simulate why such deviations occur, explaining the difference between strategy development and implementation.Keywords: strategy, simulation, project management, game theory
Procedia PDF Downloads 138
8651 Generative Design of Acoustical Diffuser and Absorber Elements Using Large-Scale Additive Manufacturing
Authors: Saqib Aziz, Brad Alexander, Christoph Gengnagel, Stefan Weinzierl
Abstract:
This paper explores a generative design, simulation, and optimization workflow for the integration of acoustical diffuser and/or absorber geometry with embedded coupled Helmholtz-resonators for full-scale 3D printed building components. Large-scale additive manufacturing in conjunction with algorithmic CAD design tools enables a vast amount of control when creating geometry. This is advantageous regarding the increasing demands of comfort standards for indoor spaces and the use of more resourceful and sustainable construction methods and materials. The presented methodology highlights these new technological advancements and offers a multimodal and integrative design solution with the potential for an immediate application in the AEC-Industry. In principle, the methodology can be applied to a wide range of structural elements that can be manufactured by additive manufacturing processes. The current paper focuses on a case study of an application for a biaxial load-bearing beam grillage made of reinforced concrete, which allows for a variety of applications through the combination of additive prefabricated semi-finished parts and in-situ concrete supplementation. The semi-prefabricated parts or formwork bodies form the basic framework of the supporting structure and at the same time have acoustic absorption and diffusion properties that are precisely acoustically programmed for the space underneath the structure. To this end, a hybrid validation strategy is being explored using a digital and cross-platform simulation environment, verified with physical prototyping. The iterative workflow starts with the generation of a parametric design model for the acoustical geometry using the algorithmic visual scripting editor Grasshopper3D inside the building information modeling (BIM) software Revit. 
Various geometric attributes (i.e., bottleneck and cavity dimensions) of the resonator are parameterized and fed to a numerical optimization algorithm which can modify the geometry with the goal of increasing absorption at resonance and increasing the bandwidth of the effective absorption range. Using Rhino.Inside and LiveLink for Revit, the generative model was imported directly into the Multiphysics simulation environment COMSOL. The geometry was further modified and prepared for simulation in a semi-automated process. The incident and scattered pressure fields were simulated, from which the surface normal absorption coefficients were calculated. This reciprocal process was repeated to further optimize the geometric parameters. Subsequently, the numerical models were compared to a set of 3D concrete printed physical twin models, which were tested in a 0.25 m × 0.25 m impedance tube. The empirical results served to improve the starting parameter settings of the initial numerical model. The geometry resulting from the numerical optimization was finally returned to Grasshopper for further implementation in an interdisciplinary study.Keywords: acoustical design, additive manufacturing, computational design, multimodal optimization
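The resonator attributes being optimized (bottleneck and cavity dimensions) map onto the classic lumped-element estimate of the Helmholtz resonance; a sketch follows. The flanged end correction used here is a textbook assumption — the paper's COMSOL model resolves the actual geometry rather than relying on this formula.

```python
import math

def helmholtz_f0(neck_radius, neck_length, cavity_volume, c=343.0):
    """Lumped-element estimate f0 = c/(2*pi) * sqrt(S / (V * L_eff)).
    The flanged end correction L_eff = L + 1.7*r is an assumption;
    dimensions in meters, result in Hz."""
    S = math.pi * neck_radius ** 2           # neck cross-section (m^2)
    L_eff = neck_length + 1.7 * neck_radius  # corrected neck length (m)
    return c / (2 * math.pi) * math.sqrt(S / (cavity_volume * L_eff))

# Illustrative resonator: 1 cm neck radius, 3 cm neck, 1 L cavity.
f0 = helmholtz_f0(neck_radius=0.01, neck_length=0.03, cavity_volume=1e-3)
```

Enlarging the cavity or lengthening the bottleneck lowers f0, which is exactly the lever the numerical optimization tunes to place absorption at the desired resonance.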
Procedia PDF Downloads 157
8650 A Semi-supervised Classification Approach for Trend Following Investment Strategy
Authors: Rodrigo Arnaldo Scarpel
Abstract:
Trend following is a widely accepted investment strategy that adopts a rule-based trading mechanism: rather than striving to predict market direction or relying on information gathering to decide when to buy and when to sell a stock, trend following responds to what has recently happened and what is currently happening in the market, not to what will happen. The optimal outcome of a trend following strategy is to catch a bull market at its early stage, ride the trend, and liquidate the position at the first evidence of the subsequent bear market. To apply the trend following strategy, one needs to find the trend and identify trade signals. In order to avoid false signals, i.e., to identify fluctuations of short, mid and long terms and to separate noise from real changes in the trend, most academic works rely on moving averages and other technical analysis indicators, such as the moving average convergence divergence (MACD) and the relative strength index (RSI), to uncover intelligible stock trading rules following the trend following philosophy. Recently, some works have applied machine learning techniques for trade rule discovery. In those works, the process of rule construction is based on evolutionary learning, which aims to adapt the rules to the current environment and searches for globally optimal rules in the search space. In this work, instead of focusing on machine learning techniques for creating trading rules, a time series trend classification employing a semi-supervised approach was used to identify early both the beginning and the end of upward and downward trends. Such a classification model can be employed to identify trade signals, and the decision-making procedure is that if an up-trend (down-trend) is identified, a buy (sell) signal is generated.
Semi-supervised learning is used for model training when only part of the data is labeled; semi-supervised classification aims to train a classifier from both the labeled and unlabeled data, such that it is better than a supervised classifier trained only on the labeled data. To illustrate the proposed approach, daily trade information was employed, including the open, high, low and closing values and volume from January 1, 2000 to December 31, 2022, of the São Paulo Exchange Composite index (IBOVESPA). Over this time period, consistent changes in price, upwards or downwards, were visually identified for assigning labels, leaving the rest of the days (when there was no consistent change in price) unlabeled. For training the classification model, a pseudo-label semi-supervised learning strategy was used employing different technical analysis indicators. In this learning strategy, the core idea is to use the unlabeled data to generate pseudo-labels for supervised training. The achieved results were evaluated using the annualized return and excess return, as well as the Sortino and Sharpe indicators. Over the evaluated time period, the obtained results were very consistent and can be considered promising for generating the intended trading signals.Keywords: evolutionary learning, semi-supervised classification, time series data, trading signals generation
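The pseudo-label strategy described can be sketched with scikit-learn's self-training wrapper, which iteratively adds confidently predicted unlabeled samples (pseudo-labels) to the training set. The two-cluster data below is an illustrative stand-in for the technical-indicator features; it is not the IBOVESPA dataset, and the base classifier is an assumption.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.semi_supervised import SelfTrainingClassifier

rng = np.random.default_rng(1)

# Two synthetic clusters stand in for indicator features of down-trend (0)
# and up-trend (1) days; most days are unlabeled, marked with -1.
n = 600
X = np.vstack([rng.normal(-1, 1, (n // 2, 2)), rng.normal(1, 1, (n // 2, 2))])
y_true = np.array([0] * (n // 2) + [1] * (n // 2))
y = np.full(n, -1)
labeled = rng.choice(n, size=60, replace=False)
y[labeled] = y_true[labeled]

# Pseudo-labeling: unlabeled days predicted with probability above the
# threshold are added to the training set on each iteration.
model = SelfTrainingClassifier(LogisticRegression(), threshold=0.9)
model.fit(X, y)
accuracy = model.score(X, y_true)
```

A predicted up-trend (down-trend) class would then translate into a buy (sell) signal, as in the decision procedure above.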
Procedia PDF Downloads 89
8649 Life Cycle Assessment of Biogas Energy Production from a Small-Scale Wastewater Treatment Plant in Central Mexico
Authors: Joel Bonales, Venecia Solorzano, Carlos Garcia
Abstract:
A great percentage of the wastewater generated in developing countries does not receive any treatment, which leads to numerous environmental impacts. In response, a paradigm change has been proposed: from the current wastewater treatment model based on large-scale plants towards a model based on small and medium scales. Nevertheless, small-scale wastewater treatment (SS-WWTP) with novel technologies such as anaerobic digesters, as well as the utilization of derivative co-products such as biogas, still presents diverse environmental impacts that must be assessed. This study consisted of a Life Cycle Assessment (LCA) performed on a SS-WWTP that treats wastewater from a small commercial block in the city of Morelia, Mexico. The treatment performed in the SS-WWTP consists of anaerobic and aerobic digesters with a daily capacity of 5,040 L. Two different scenarios were analyzed: the current plant conditions and a hypothetical energy use of biogas obtained in situ. Furthermore, two different allocation criteria were applied: full impact allocation to the system's main product (treated water), and substitution credits for replacing Mexican grid electricity (biogas) and clean water pumping (treated water). The results showed that the analyzed plant had bigger impacts than what has been reported in the literature on the basis of wastewater volume treated, which may imply that this plant is currently operating inefficiently. The evaluated impacts were concentrated in the aerobic digestion and electricity generation phases due to the plant's particular configuration. Additional findings show that the allocation criteria applied are crucial for the interpretation of impacts and that the energy use of the biogas obtained in this plant can help mitigate associated climate change impacts. It is concluded that SS-WWTP is an environmentally sound alternative for wastewater treatment from a systemic perspective.
However, this type of study must be careful in the selection of the allocation criteria and replaced products, since these factors have a great influence on the results of the assessment.Keywords: biogas, life cycle assessment, small scale treatment, wastewater treatment
Procedia PDF Downloads 124
8648 Talent Management in Small and Medium Sized Companies: A Multilevel Approach Contextualized in France
Authors: Kousay Abid
Abstract:
The aim of this paper is to better understand talent and talent management (TM) in small and medium-sized French companies (SMEs). While previous empirical investigations have largely focused on multinationals and big companies and concentrated on the Anglo-Saxon context, we focus on the pressing need to implement TM strategies and practices not only on the new ground of SMEs but also within a new European context, that of France. This study also aims at understanding the strategies adopted by these firms to attract, retain, maintain and develop talents. We contribute to TM issues by adopting a multilevel approach, with the goal of reaching a global, holistic vision of the interactions between the various levels at which TM is applied. A qualitative research methodology based on a multiple-case study design, built first on a qualitative survey and second on two in-depth case studies, both drawing on interviews, is used to develop an analysis of TM strategies and practices. The findings are based on data collected from more than 15 French SMEs. Our theoretical contributions are the fruit of context considerations and the dynamics of the multilevel approach. Theoretically, we first attempt to clarify how talents and TM are seen and defined in French SMEs and consequently to enrich the literature on TM in SMEs outside the Anglo-Saxon context. Moreover, we seek to understand how SMEs jointly manage their talents and their TM strategies by setting up this contextualized pilot study. We also address the issue of a systematic TM model arising from French SMEs. Our primary managerial goal is to shed light on the need for TM to achieve better management of these organizations by directing leaders to better identify the talented people they hold at all levels.
In addition, our systematic TM model strengthens our analysis grid as a set of recommendations for CEOs and Human Resource Development (HRD) practitioners, prompting them to rethink their companies' HR business strategies. Therefore, our outputs present multiple levers of action that should be taken into consideration when reviewing HR strategies and systems, as well as their impact beyond organizational boundaries.Keywords: French context, multilevel approach, small and medium-sized enterprises, talent management
Procedia PDF Downloads 180
8647 Experience of the Formation of Professional Competence of Students of IT-Specialties
Authors: B. I. Zhumagaliyev, L. Sh. Balgabayeva, G. S. Nabiyeva, B. A. Tulegenova, P. Oralkhan, B. S. Kalenova, S. S. Akhmetov
Abstract:
The article describes an approach to building research competence in Bachelor's and Master's students, which is now an important attribute of the modern specialist in the field of engineering. It provides an example of teaching methods with a research aspect, including the formulation of the problem, the method of conducting experiments, and the analysis of the results. Implementing these methods allows students to better consolidate their knowledge and skills while gaining research experience. On the instructors' part, this requires some training in the subject area and in teaching methods.Keywords: professional competence, model of IT specialties, teaching methods, educational technology, decision making
Procedia PDF Downloads 436
8646 Next Generation Radiation Risk Assessment and Prediction Tools Generation Applying AI-Machine (Deep) Learning Algorithms
Authors: Selim M. Khan
Abstract:
Indoor air quality is strongly influenced by the presence of radioactive radon (222Rn) gas. Exposure to high 222Rn concentrations is unequivocally linked to DNA damage and lung cancer and is a worsening issue in North American and European built environments, having increased over time within newer housing stocks as a function of as yet unclear variables. Indoor air radon concentration can be influenced by a wide range of environmental, structural, and behavioral factors. As some of these factors are quantitative while others are qualitative, no single statistical model can determine indoor radon levels precisely while simultaneously considering all these variables across a complex and highly diverse dataset. The ability of AI machine (deep) learning to simultaneously analyze multiple quantitative and qualitative features makes it suitable for predicting radon with a high degree of precision. Using Canadian and Swedish long-term indoor air radon exposure data, we are using artificial deep neural network models with random weights, alongside polynomial statistical models in MATLAB, to assess and predict radon health risk to humans as a function of geospatial, human behavioral, and built environmental metrics. Our initial artificial neural network with random weights, run with a sigmoid activation, tested different combinations of variables and showed the highest prediction accuracy (>96%) within a reasonable number of iterations. Here, we present details of these emerging methods and discuss their strengths and weaknesses compared to the traditional artificial neural network and statistical methods commonly used to predict indoor air quality in different countries. We propose an artificial deep neural network with random weights as a highly effective method for assessing and predicting indoor radon.Keywords: radon, radiation protection, lung cancer, AI-machine deep learning, risk assessment, risk prediction, Europe, North America
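A network with random (untrained) hidden weights and a sigmoid activation, as described, can be sketched as a random-feature regressor in which only the output weights are fitted by least squares. The toy target and the two input features below are illustrative stand-ins for the geospatial and behavioral metrics; nothing here reproduces the paper's MATLAB models or its radon data.

```python
import numpy as np

rng = np.random.default_rng(42)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def fit_random_weight_net(X, y, hidden=50, ridge=1e-3):
    """Hidden weights stay random (untrained); only the output weights
    beta are fitted, by ridge-regularized least squares."""
    W = rng.normal(size=(X.shape[1], hidden))
    b = rng.normal(size=hidden)
    H = sigmoid(X @ W + b)                      # random hidden features
    beta = np.linalg.solve(H.T @ H + ridge * np.eye(hidden), H.T @ y)
    return W, b, beta

def predict(X, W, b, beta):
    return sigmoid(X @ W + b) @ beta

# Toy stand-ins for two radon drivers (e.g., one geospatial and one
# behavioral metric); the target function is illustrative only.
X = rng.uniform(-1, 1, size=(300, 2))
y = np.sin(2 * X[:, 0]) + 0.5 * X[:, 1] ** 2
W, b, beta = fit_random_weight_net(X, y)
resid = y - predict(X, W, b, beta)
r2 = 1 - np.sum(resid ** 2) / np.sum((y - y.mean()) ** 2)
```

Because only a linear solve is trained, such models fit in one shot, which is one reason random-weight networks are attractive for tabular risk-prediction tasks.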
Procedia PDF Downloads 96
8645 Activation of NLRP3 Inflammasomes by Helicobacter pylori Infection in Innate Cellular Model and Its Correlation to IL-1β Production
Authors: Islam Nowisser, Noha Farag, Mohamed El Azizi
Abstract:
Helicobacter pylori is a highly important human pathogen that colonizes the stomachs of about 50% of the population worldwide. Infection with this bacterium is very hard to treat, with a high probability of recurrence. H. pylori causes severe gastric diseases, including peptic ulcer, gastritis, and gastric cancer, which have been linked to chronic inflammation. The infection has been reported to be associated with high levels of pro-inflammatory cytokines, especially IL-1β and TNF-α. The aim of the current study is to investigate the molecular mechanisms by which H. pylori activates the NLRP3 inflammasome and its contribution to IL-1β production in an innate cellular model. The H. pylori standard strains PMSS1 and G27, the isogenic mutant strains PMSS1ΔVacA, G27ΔVacA, and G27ΔCagA, and clinical isolates obtained from biopsy samples from the antrum and corpus mucosa of chronic gastritis patients were used to establish infection in RAW-264.7 macrophages. The production levels of TNF-α and IL-1β were assessed using ELISA. Since expression of these cytokines is often regulated by the transcription factor complex nuclear factor-kB (NF-kB), the activation of NF-κB in H. pylori-infected cells was also evaluated by luciferase assay. Genomic DNA was extracted from bacterial cultures of the H. pylori clinical isolates as well as the standard strains and their corresponding mutants, and evaluated for cagA pathogenicity island and vacA expression. The correlation between these findings and the expression of the cagA pathogenicity island and vacA in the bacteria was also investigated. The results showed that IL-1β and TNF-α production significantly increased in RAW macrophages following H. pylori infection. The cagA+ and vacA+ H. pylori strains induced significantly higher production of IL-1β compared to cagA- and vacA- strains. The activation pattern of NF-κB in the isolates correlated with their cagA and vacA expression profiles.
A similar finding could not be confirmed for TNF-α production. Our study shows the ability of H. pylori to activate NF-kB and induce significant IL-1β production as a possible mechanism for the augmented inflammatory response seen in subjects infected with cagA+ and vacA+ H. pylori strains, which would lead to progression to a more severe form of the disease.Keywords: Helicobacter pylori, IL-1β, inflammatory cytokines, nuclear factor KB, TNF-α
Procedia PDF Downloads 127
8644 Sustainable Hydrogel Nanocomposites Based on Grafted Chitosan and Clay for Effective Adsorption of Cationic Dye
Authors: H. Ferfera-Harrar, T. Benhalima, D. Lerari
Abstract:
Contamination of water, due to the discharge of untreated industrial wastewaters into the ecosystem, has become a serious problem for many countries. In this study, bioadsorbents based on chitosan-g-poly(acrylamide) and montmorillonite (MMt) clay (CTS-g-PAAm/MMt) hydrogel nanocomposites were prepared via free‐radical grafting copolymerization and crosslinking of acrylamide monomer (AAm) onto the natural polysaccharide chitosan (CTS) as backbone, in the presence of various contents of MMt clay as nanofiller. They were then hydrolyzed to obtain highly functionalized pH‐sensitive nanomaterials with superior swelling properties. Their structure was characterized by X-Ray Diffraction (XRD) and Scanning Electron Microscopy (SEM) analyses. The adsorption performance of the developed nanohybrids was examined for the removal of methylene blue (MB) cationic dye from aqueous solutions. The factors affecting the removal of MB, such as clay content, pH, adsorbent dose, initial dye concentration and temperature, were explored. The adsorption process was found to be highly pH dependent. The adsorption kinetics showed remarkable adsorption capacity and a fast adsorption rate: more than 88% MB removal efficiency was reached after 50 min in a 200 mg L-1 dye solution. In addition, incorporating clay enhanced the adsorption capacity of the CTS-g-PAAm matrix from 1685 to a highest value of 1749 mg g-1 for the optimized nanocomposite containing 2 wt.% of MMt. The experimental kinetic data were well described by the pseudo-second-order model, while the equilibrium data were represented perfectly by the Langmuir isotherm model. The maximum Langmuir equilibrium adsorption capacity (qm) was found to increase from 2173 mg g−1 to 2221 mg g−1 upon adding 2 wt.% of clay nanofiller. Thermodynamic parameters revealed the spontaneous and endothermic nature of the process.
In addition, the reusability study revealed that these bioadsorbents could be well regenerated, with a desorption efficiency above 87%, and that even after four consecutive adsorption/desorption cycles the removal efficiency still exceeded 64%, with no obvious decrease compared to the starting materials. These results suggest that the optimized nanocomposites are promising low-cost bioadsorbents.Keywords: chitosan, clay, dye adsorption, hydrogels nanocomposites
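The Langmuir fit behind the reported qm (≈ 2221 mg g⁻¹) can be illustrated with a nonlinear least-squares sketch. The equilibrium data below are synthetic values generated from the model itself plus small perturbations, not the paper's measurements.

```python
import numpy as np
from scipy.optimize import curve_fit

def langmuir(Ce, qm, KL):
    """Langmuir isotherm: qe = qm * KL * Ce / (1 + KL * Ce),
    with qm the monolayer capacity and KL the affinity constant."""
    return qm * KL * Ce / (1 + KL * Ce)

# Synthetic equilibrium data (Ce in mg/L, qe in mg/g) generated from the
# model with qm = 2221 mg/g and KL = 0.05 L/mg plus small perturbations.
Ce = np.array([10.0, 25.0, 50.0, 100.0, 200.0, 400.0])
qe = langmuir(Ce, 2221.0, 0.05) + np.array([15.0, -20.0, 10.0, -5.0, 12.0, -8.0])

# Nonlinear fit recovers the isotherm parameters from the data.
(qm_fit, KL_fit), _ = curve_fit(langmuir, Ce, qe, p0=(2000.0, 0.01))
```

Fitting the nonlinear form directly, as here, avoids the bias that linearized Langmuir plots can introduce at low concentrations.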
Procedia PDF Downloads 122
8643 Characterization of the In0.53Ga0.47As n+nn+ Photodetectors
Authors: Fatima Zohra Mahi, Luca Varani
Abstract:
We present an analytical model for the calculation of the sensitivity, the spectral current noise and the detectivity of an optically illuminated In0.53Ga0.47As n+nn+ diode. The photocurrent due to the excess carriers is obtained by solving the continuity equation. Moreover, the current noise level is evaluated at room temperature under a constant voltage applied between the diode terminals. The analytical calculation of the current noise in the n+nn+ structure is developed. The responsivity and the detectivity are discussed as functions of the doping concentrations and the emitter layer thickness in a one-dimensional homogeneous n+nn+ structure.Keywords: detectivity, photodetectors, continuity equation, current noise
Procedia PDF Downloads 643
8642 Analysis of the Inverse Kinematics for 5 DOF Robot Arm Using D-H Parameters
Authors: Apurva Patil, Maithilee Kulkarni, Ashay Aswale
Abstract:
This paper proposes an algorithm to develop the kinematic model of a 5 DOF robot arm. The formulation of the problem is based on finding the D-H parameters of the arm. A brute-force iterative method is employed to solve the system of nonlinear equations. The focus of the paper is on obtaining accurate solutions by reducing the root mean square error. The results obtained are implemented to grip objects. The trajectories followed by the end effector for the required workspace coordinates are plotted. The methodology used here can be applied to solve the problem for any other kinematic chain of up to six DOF.Keywords: 5 DOF robot arm, D-H parameters, inverse kinematics, iterative method, trajectories
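The D-H formulation underlying such a kinematic model can be sketched with the standard per-joint homogeneous transform chained into forward kinematics. The 2-link planar check at the end is an illustrative assumption, not the paper's 5 DOF parameter table.

```python
import numpy as np

def dh_transform(theta, d, a, alpha):
    """Standard Denavit-Hartenberg homogeneous transform for one joint."""
    ct, st = np.cos(theta), np.sin(theta)
    ca, sa = np.cos(alpha), np.sin(alpha)
    return np.array([
        [ct, -st * ca,  st * sa, a * ct],
        [st,  ct * ca, -ct * sa, a * st],
        [0.0,      sa,       ca,      d],
        [0.0,     0.0,      0.0,    1.0],
    ])

def forward_kinematics(dh_rows):
    """Chain the per-joint transforms; each row is (theta, d, a, alpha)."""
    T = np.eye(4)
    for row in dh_rows:
        T = T @ dh_transform(*row)
    return T

# Illustrative 2-link planar check (not the paper's 5 DOF table): unit links
# with joints at +90 and -90 degrees place the end effector at (1, 1, 0).
T = forward_kinematics([(np.pi / 2, 0.0, 1.0, 0.0),
                        (-np.pi / 2, 0.0, 1.0, 0.0)])
```

An iterative inverse-kinematics solver of the kind described would repeatedly evaluate this forward map and adjust the joint angles to shrink the RMS position error.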
Procedia PDF Downloads 202
8641 Modeling the Three - Echelon Repairable Parts Inventory System under (S-1, S) Policy
Authors: Rohit Kapoor
Abstract:
In this paper, an attempt is made to formulate the three-echelon repairable parts inventory system under the (S-1, S) policy. This analytical model extends an exact formulation of the two-echelon repairable parts inventory system already reported in the established literature. In the present paper, we formulate the total cost expression as the sum of two components, viz., the system investment cost and the expected backorder cost.Keywords: (S-1, S) inventory policy, multi-echelon inventory system, repairable parts
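The two cost components can be sketched for a single site using Palm's theorem, under which the number of units in resupply under an (S-1, S) policy is Poisson with mean equal to demand rate times mean repair time. All numbers below are hypothetical, and this single-location sketch is far simpler than the paper's three-echelon formulation.

```python
import math

def expected_backorders(S, mu, tail=60):
    """E[(X - S)^+] for X ~ Poisson(mu). By Palm's theorem, X models the
    units in resupply under (S-1, S) with mu = demand rate * repair time."""
    p = math.exp(-mu)  # P(X = 0)
    ebo = 0.0
    for x in range(1, S + tail):
        p *= mu / x    # recurrence: P(X = x) = P(X = x-1) * mu / x
        if x > S:
            ebo += (x - S) * p
    return ebo

def total_cost(S, mu, unit_cost, backorder_cost):
    """Illustrative single-site cost: stock investment plus backorder penalty."""
    return unit_cost * S + backorder_cost * expected_backorders(S, mu)

# Hypothetical numbers: demand 2/period, repair time 1.5 periods -> mu = 3.
costs = {S: total_cost(S, 3.0, unit_cost=10.0, backorder_cost=50.0)
         for S in range(11)}
best_S = min(costs, key=costs.get)
```

Because expected backorders are convex and decreasing in S, the cost curve is unimodal and the enumeration above finds the optimal stock level directly.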
Procedia PDF Downloads 538
8640 Observationally Constrained Estimates of Aerosol Indirect Radiative Forcing over Indian Ocean
Authors: Sofiya Rao, Sagnik Dey
Abstract:
Aerosol-cloud-precipitation interaction continues to be one of the largest sources of uncertainty in quantifying aerosol climate forcing, and the uncertainty increases from the global to the regional scale. This problem remains unresolved due to the large discrepancies in the representation of cloud processes in climate models. Most studies of aerosol-cloud-climate and aerosol-cloud-precipitation interaction over the Indian Ocean (e.g., the INDOEX and CAIPEEX campaigns) are restricted either to one particular season or to one particular region. Here we developed a theoretical framework to quantify aerosol indirect radiative forcing using Moderate Resolution Imaging Spectroradiometer (MODIS) aerosol and cloud products over a 15-year (2000-2015) period over the Indian Ocean. This framework relies on an observationally constrained estimate of the aerosol-induced change in cloud albedo. We partitioned the change in cloud albedo into the change in Liquid Water Path (LWP) and Effective Radius of Clouds (Reff) in response to aerosol optical depth (AOD). The cloud albedo response to an increase in AOD is most sensitive in the LWP range of 120-300 g/m² for Reff varying from 8-24 micrometers, meaning clouds respond most strongly to aerosols within this range of LWP and Reff. Using this framework, aerosol forcing during a transition from the indirect to the semi-direct effect is also calculated. The analysis yields the clearest results over the Arabian Sea, in comparison with the Bay of Bengal and the South Indian Ocean, because of the heterogeneity of aerosol species over the Arabian Sea: during winter more absorbing aerosols dominate there, while during the pre-monsoon dust (coarse-mode aerosol particles) dominates. In winter and the pre-monsoon the aerosol forcing dominates, while during the monsoon and post-monsoon seasons meteorological forcing dominates.
Over the South Indian Ocean, largely the same aerosol type (sea salt) is present. Over the Arabian Sea, the aerosol indirect radiative forcing is -5 ± 4.5 W/m² in the winter season and weaker in the other seasons. The results provide observationally constrained estimates of aerosol indirect forcing over the Indian Ocean, which can be helpful in evaluating climate model performance in the context of such complex interactions.
Keywords: aerosol-cloud-precipitation interaction, aerosol-cloud-climate interaction, indirect radiative forcing, climate model
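The partitioning of the albedo response into LWP and Reff rests on how cloud albedo depends on those two quantities. A minimal sketch of that dependence (not the authors' code; the thin-cloud optical-depth relation and the two-stream albedo approximation used here are standard textbook assumptions):

```python
# Hedged sketch: relate LWP and effective radius to cloud optical depth and
# albedo, the two quantities the abstract partitions the albedo response into.
# Assumes tau = 3*LWP / (2*rho_w*r_eff) and the two-stream approximation
# A = tau / (tau + 7.7); illustrative only, not the study's retrieval.

RHO_W = 1.0e6  # density of liquid water, g/m^3

def cloud_optical_depth(lwp_g_m2: float, r_eff_m: float) -> float:
    """Optical depth from liquid water path (g/m^2) and effective radius (m)."""
    return 3.0 * lwp_g_m2 / (2.0 * RHO_W * r_eff_m)

def cloud_albedo(tau: float) -> float:
    """Two-stream approximation to cloud albedo from optical depth."""
    return tau / (tau + 7.7)

# Albedo over the LWP range the study flags as most sensitive (120-300 g/m^2),
# at a fixed effective radius of 12 micrometres.
for lwp in (120, 200, 300):
    tau = cloud_optical_depth(lwp, 12e-6)
    print(f"LWP={lwp:3d} g/m^2  tau={tau:5.2f}  albedo={cloud_albedo(tau):.3f}")
```

Holding Reff fixed isolates the LWP contribution to the albedo change; repeating the sweep over Reff at fixed LWP gives the other half of the partition.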
Procedia PDF Downloads 175
8639 A Strategy Therapy for Retinitis Pigmentosa Induced by Argon Laser in Rabbits by High Dose Adult Stem Cells
Authors: Hager E. Amer, Hany El Saftawy, Laila Rashed, Ahmed M. Ata, Fatma Metwally, Hesham Mettawei, Hossam E. Sayed, Tamer Adel, Kareem M. El Sawah
Abstract:
Aim: The purpose of this study is to regenerate the photoreceptor cells damaged by argon laser, as a model of retinitis pigmentosa in the rabbit retina, using adult stem cells from rabbit bone marrow. Background: Retinitis pigmentosa (RP) is a group of inherited disorders that primarily affect photoreceptor and pigment epithelium function. RP leads to loss of the rod outer segments, shortening the photoreceptor layer and exposing the photoreceptor cell bodies to high oxygen levels (oxidative stress), which leads to apoptosis of the rod and cone cells. At present, there is no specific treatment for retinitis pigmentosa. Materials and Methods: Forty-two Giant (Rex) rabbits were used in this experiment, divided into 3 groups: Group 1: control (6 rabbits); Group 2: argon-laser irradiated as a model of retinitis pigmentosa (12 rabbits); Group 3: laser irradiated and treated with 6 million stem cells (12 rabbits). Each of the last two groups was divided into two subgroups of 6 rabbits each. Ophthalmological examination was performed on the rabbits, and blood and retina samples were taken 25 days and 36 days after laser irradiation (10 days and 21 days after stem cell insertion in group 3) for biochemical analysis. Results: Compared to the control group, group 2 showed a decrease in ERG wave amplitude and antioxidant substances (glutathione) in blood and retina, and an increase in oxidative stress markers (nitric oxide, malondialdehyde, and carbonyl protein) and apoptotic markers (advanced glycation end products and matrix metalloproteinase) in blood and retina. Compared to group 2, group 3 showed mostly increased antioxidant substances and ERG wave amplitude, and mostly decreased oxidative stress and apoptotic markers.
Conclusion: Intravitreal insertion of 6 million stem cells gave good results in regenerating the damaged photoreceptor cells after 21 days.
Keywords: retinitis pigmentosa, stem cells, argon laser, oxidative stress, apoptosis
Procedia PDF Downloads 198
8638 Nonlinear Observer Canonical Form for Genetic Regulation Process
Authors: Bououden Soraya
Abstract:
This paper studies the existence of a change of coordinates that transforms a class of nonlinear dynamical systems into the so-called nonlinear observer canonical form (NOCF). Moreover, an algorithm to construct such a change of coordinates is given. Based on this form, we can design an observer with linear error dynamics, which enables us to estimate the state of a nonlinear dynamical system. A concrete example (a biological model) is provided to illustrate the feasibility of the proposed results.
Keywords: nonlinear observer canonical form, observer design, gene regulation, gene expression
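The point of the canonical form is that the observer's error obeys a linear equation even though the plant is nonlinear. A toy sketch (not the paper's system; the 2-state plant, nonlinearity, and gains below are invented for illustration): with z' = Az + φ(y), y = Cz, the observer ẑ' = Aẑ + φ(y) + L(y − Cẑ) has error dynamics e' = (A − LC)e, so φ cancels and L alone fixes the convergence rate.

```python
import math

# Hypothetical output nonlinearity (output-injection term phi(y)).
def phi(y):
    return (0.0, -math.sin(y))

def simulate(dt=0.001, steps=8000):
    z = [1.0, 0.5]      # true state
    zh = [0.0, 0.0]     # observer state
    l1, l2 = 3.0, 2.0   # places eigenvalues of A - L*C at -1 and -2
    for _ in range(steps):
        y = z[0]        # measured output, y = C z with C = [1, 0]
        p = phi(y)
        # plant in observer canonical form: z1' = z2 + phi1(y), z2' = phi2(y)
        z = [z[0] + dt * (z[1] + p[0]), z[1] + dt * p[1]]
        # observer: copy of the plant plus linear output-error injection
        err = y - zh[0]
        zh = [zh[0] + dt * (zh[1] + p[0] + l1 * err),
              zh[1] + dt * (p[1] + l2 * err)]
    return z, zh

z, zh = simulate()
print("estimation error:", abs(z[0] - zh[0]) + abs(z[1] - zh[1]))
```

Because φ(y) appears identically in plant and observer, subtracting the two update laws leaves the purely linear error system, which is why the estimate converges at the rate set by the eigenvalues of A − LC.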
Procedia PDF Downloads 432
8637 The Psychometric Properties of an Instrument to Estimate Performance in Ball Tasks Objectively
Authors: Kougioumtzis Konstantin, Rylander Pär, Karlsteen Magnus
Abstract:
Ball skills, as a subset of fundamental motor skills, are predictors of performance in sports. Currently, most tools evaluate ball skills using subjective ratings. The aim of this study was to examine the psychometric properties of a newly developed instrument to objectively measure ball-handling skills (BHS-test) utilizing digital instruments. Participants were a convenience sample of 213 adolescents (age M = 17.1 years, SD = 3.6; 55% females, 45% males) recruited from upper secondary schools and invited to a sports hall for the assessment. The 8-item instrument incorporated both accuracy-based ball skill tests and repetitive-performance tests with a ball. Testers counted performance manually in four of the tests (one throwing and three juggling tasks). Assessment in the other four tests (one balancing and three rolling tasks) was technologically enhanced, utilizing a ball machine, a Kinect camera, and balls with motion sensors. 3D printing technology was used to construct equipment, and all results were administered digitally with smartphones/tablets, computers, and a specially constructed application that sent data to a server. The instrument was deemed reliable (α = .77), and principal component analysis was used on a random subset (n = 53). Latent variable modeling was then employed to confirm the structure with the remaining subset (n = 160). The analysis showed good factorial validity, with one factor explaining 57.90% of the total variance. Four loadings were larger than .80, two more exceeded .76, and the other two were .65 and .49. The one-factor solution was confirmed by a first-order model with one general factor and an excellent fit between model and data (χ² = 16.12, DF = 20; RMSEA = .00, CI90 .00–.05; CFI = 1.00; SRMR = .02). The loadings on the general factor ranged between .65 and .83. Our findings indicate good reliability and construct validity for the BHS-test.
To develop the instrument further, more studies are needed with various age groups, e.g., children. We suggest using the BHS-test for diagnostic or assessment purposes in talent development and sports participation interventions that focus on ball games.
Keywords: ball-handling skills, ball-handling ability, technologically-enhanced measurements, assessment
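The α = .77 reliability reported above is Cronbach's alpha, the standard internal-consistency coefficient for a multi-item instrument. A minimal sketch of how it is computed (the toy scores below are invented, not the study's data):

```python
def cronbach_alpha(items):
    """Cronbach's alpha; items is a list of per-item score lists (one entry per person)."""
    k = len(items)
    def var(xs):
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)
    totals = [sum(col) for col in zip(*items)]
    # alpha = k/(k-1) * (1 - sum of item variances / variance of total scores)
    return (k / (k - 1)) * (1.0 - sum(var(it) for it in items) / var(totals))

# Toy data: 4 hypothetical ball-skill items scored for 6 participants.
# These items are perfectly parallel (each differs by a constant), so alpha = 1.
scores = [
    [1, 2, 3, 4, 5, 6],
    [2, 3, 4, 5, 6, 7],
    [0, 1, 2, 3, 4, 5],
    [1, 2, 3, 4, 5, 6],
]
print("alpha =", cronbach_alpha(scores))
```

Real item sets are noisier, so alpha lands below 1; values around .7-.8, as reported for the BHS-test, are conventionally taken as acceptable reliability.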
Procedia PDF Downloads 94
8636 Lean Comic GAN (LC-GAN): A Light-Weight GAN Architecture Leveraging Factorized Convolution and Teacher Forcing Distillation Style Loss Aimed to Capture Two Dimensional Animated Filtered Still Shots Using Mobile Phone Camera and Edge Devices
Authors: Kaustav Mukherjee
Abstract:
In this paper, we propose a Neural Style Transfer solution: a lightweight separable-convolution-kernel-based GAN architecture (LC-GAN) that is useful for designing filters for mobile phone cameras and edge devices, converting any image to the style of 2D animated comic movies like HEMAN, SUPERMAN, and JUNGLE-BOOK. This helps 2D animation artists create new characters from images of real people without endless hours of manual labour drawing each and every pose of a cartoon; it can even be used to create scenes from real-life images. This reduces the turnaround time to make 2D animated movies and decreases cost in terms of manpower and time. In addition, being extremely lightweight, it can be used for camera filters capable of taking comic-style shots using a mobile phone camera or edge-device cameras such as the Raspberry Pi 4 and NVIDIA Jetson Nano. Existing methods like CartoonGAN, with a model size close to 170 MB, are too heavyweight for mobile phones and edge devices due to their limited resources. Compared to the current state of the art, our proposed method has a total model size of 31 MB, which makes it ideal and highly efficient for designing camera filters on low-resource devices like mobile phones, tablets, and edge devices running an OS or RTOS. Owing to the use of high-resolution input and a bigger convolution kernel size, it produces richer-resolution comic-style pictures with 6 times fewer parameters, trained with just 25 extra epochs on a dataset of fewer than 1000 images, which breaks the myth that all GANs need a mammoth amount of data.
Our network reduces the density of the GAN architecture by using depthwise separable convolution, which performs the convolution operation on each of the RGB channels separately; a point-wise convolution with a 1x1 kernel then restores the required channel count. This reduces the number of parameters substantially and makes the network extremely lightweight and suitable for mobile phones and edge devices. The architecture also makes use of parameterised batch normalization (Goodfellow et al., Deep Learning, 'Optimization for Training Deep Models', p. 320), which gives the network the training benefits of batch norm while maintaining non-linear feature capture through the learnable parameters.
Keywords: comic stylisation from camera image using GAN, creating 2D animated movie style custom stickers from images, depth-wise separable convolutional neural network for light-weight GAN architecture for edge devices, GAN architecture for 2D animated cartoonizing neural style, neural style transfer for edge, model distillation, perceptual loss
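The parameter saving from replacing a standard convolution with a depthwise separable one can be checked with simple counting. A sketch with generic layer shapes (the 3x3, 64-in/128-out example is illustrative, not the paper's exact architecture):

```python
def standard_conv_params(k, c_in, c_out):
    """Weights in a k x k standard convolution (bias terms omitted)."""
    return k * k * c_in * c_out

def separable_conv_params(k, c_in, c_out):
    """Depthwise k x k pass (one filter per input channel),
    followed by a 1 x 1 point-wise projection to c_out channels."""
    return k * k * c_in + c_in * c_out

k, c_in, c_out = 3, 64, 128
std = standard_conv_params(k, c_in, c_out)
sep = separable_conv_params(k, c_in, c_out)
print(f"standard: {std} params, separable: {sep} params, "
      f"reduction: {std / sep:.1f}x")
```

The reduction factor approaches k² + (k²/c_out) terms' worth, i.e., close to k² when c_out is large, which is where the multi-fold model-size shrinkage quoted above comes from.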
Procedia PDF Downloads 132
8635 Algorithm for Predicting Cognitive Exertion and Cognitive Fatigue Using a Portable EEG Headset for Concussion Rehabilitation
Authors: Lou J. Pino, Mark Campbell, Matthew J. Kennedy, Ashleigh C. Kennedy
Abstract:
A concussion is complex and nuanced, with cognitive rest being a key component of recovery. Cognitive overexertion during rehabilitation from a concussion is associated with delayed recovery. However, daily living imposes cognitive demands that may be unavoidable and difficult to quantify. Therefore, a portable tool capable of alerting patients before cognitive overexertion occurs could allow patients to maintain their quality of life while preventing symptoms and recovery setbacks. EEG allows for a sensitive measure of cognitive exertion. Clinical 32-lead EEG headsets are not practical for day-to-day concussion rehabilitation management; however, commercially available and affordable portable EEG headsets now exist. These headsets can potentially be used to continuously monitor cognitive exertion during mental tasks and alert the wearer of overexertion, with the aim of preventing symptoms and speeding recovery. The objective of this study was to test an algorithm for predicting cognitive exertion from EEG data collected with a portable headset. EEG data were acquired from 10 participants (5 males, 5 females). Each participant wore a portable 4-channel EEG headband while completing 10 tasks: rest (eyes closed), rest (eyes open), three logic puzzles of increasing difficulty, three multiplication tasks of increasing difficulty, rest (eyes open), and rest (eyes closed). After each task, the participant was asked to report their perceived level of cognitive exertion using the NASA Task Load Index (TLX). Each participant then completed a second session on a different day. A customized machine learning model was created using data from the first session. The performance of each model was then tested using data from the second session. The mean correlation coefficient between TLX scores and predicted cognitive exertion was 0.75 ± 0.16.
The results support the efficacy of the algorithm for predicting cognitive exertion and demonstrate that the algorithm, used with a portable EEG device, has the potential to aid the concussion recovery process by monitoring and warning patients of cognitive overexertion. Preventing cognitive overexertion during recovery may reduce the number of symptoms a patient experiences and may help speed the recovery process.
Keywords: cognitive activity, EEG, machine learning, personalized recovery
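The 0.75 ± 0.16 figure above is a Pearson correlation between reported TLX scores and the model's predicted exertion, computed per participant. A minimal sketch of that evaluation step (the score vectors below are invented, not the study's data):

```python
import math

def pearson_r(xs, ys):
    """Pearson correlation between reported TLX scores and model predictions."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical per-task TLX ratings and model predictions for one participant.
tlx = [10, 25, 40, 55, 70, 60, 35, 20]
predicted = [12, 20, 45, 50, 65, 66, 30, 25]
print(f"r = {pearson_r(tlx, predicted):.3f}")
```

Averaging such per-participant r values across the 10 participants, and reporting the spread, yields a summary of the form quoted in the abstract.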
Procedia PDF Downloads 220
8634 Generalized Chaplygin Gas and Varying Bulk Viscosity in Lyra Geometry
Authors: A. K. Sethi, R. N. Patra, B. Nayak
Abstract:
In this paper, we consider the Friedmann-Robertson-Walker (FRW) metric with a viscous generalized Chaplygin gas in the context of Lyra geometry. The viscosity is treated in two different ways (zero viscosity, and a non-constant ρ-dependent bulk viscosity), using a constant deceleration parameter; it is found that, in a special case, the viscous generalized Chaplygin gas reduces to the modified Chaplygin gas. The presented model indicates the presence of Chaplygin gas in the Universe. Observational constraints are applied, and the physical and geometrical nature of the Universe is discussed.
Keywords: bulk viscosity, Lyra geometry, generalized Chaplygin gas, cosmology
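For reference, the standard forms involved (textbook forms assumed here, not copied from the paper) are the generalized Chaplygin gas equation of state, the bulk-viscous effective pressure in an FRW background with Hubble rate H, and the modified Chaplygin gas to which the viscous case is said to reduce:

```latex
% generalized Chaplygin gas equation of state
p = -\frac{A}{\rho^{\alpha}}, \qquad 0 \le \alpha \le 1
% effective pressure with a density-dependent bulk viscosity \zeta(\rho)
p_{\mathrm{eff}} = p - 3\,\zeta(\rho)\,H, \qquad \zeta(\rho) = \zeta_{0}\,\rho^{n}
% modified Chaplygin gas
p = A_{1}\,\rho - \frac{B}{\rho^{\alpha}}
```

With ζ(ρ) ∝ ρ and the Friedmann relation tying H to ρ, the viscous term contributes a piece proportional to a power of ρ, which is how the effective pressure can take the modified Chaplygin form in a special case.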
Procedia PDF Downloads 175
8633 Young Adult Gay Men's Healthcare Access in the Era of the Affordable Care Act
Authors: Marybec Griffin
Abstract:
Purpose: The purpose of this cross-sectional study was to better understand healthcare usage and satisfaction among young adult gay men (YAGM), including the facility used as the usual source of healthcare, preference for coordinated healthcare, and whether their primary care provider (PCP) adequately addressed the health needs of gay men. Methods: Interviews were conducted with n=800 YAGM in New York City (NYC). Participants were surveyed about their sociodemographic characteristics and their healthcare usage, satisfaction, and access; responses were analyzed using multivariable logistic regression models. The surveys were conducted between November 2015 and June 2016. Results: The mean age of the sample was 24.22 years (SD=4.26). The racial and ethnic background of the participants was as follows: 35.8% (n=286) Black Non-Hispanic, 31.9% (n=225) Hispanic/Latino, 20.5% (n=164) White Non-Hispanic, 4.4% (n=35) Asian/Pacific Islander, and 6.9% (n=55) reporting some other racial or ethnic background. 31.1% (n=249) of the sample had an income below $14,999. 86.7% (n=694) reported having either public or private health insurance. For usual source of healthcare, 44.6% (n=357) of the sample reported a private doctor’s office, 16.3% (n=130) reported a community health center, 7.4% (n=59) reported an urgent care facility, and 7.6% (n=61) reported not having a usual source of healthcare. 56.4% (n=451) of the sample indicated a preference for coordinated healthcare. 54% (n=334) of the sample were very satisfied with their healthcare. Findings from multivariable logistic regression models indicate that participants with higher incomes (AOR=0.54, 95% CI 0.36-0.81, p < 0.01) and participants with a PCP (AOR=0.12, 95% CI 0.07-0.20, p < 0.001) were less likely to use a walk-in facility as their usual source of healthcare.
Results from the second multivariable logistic regression model indicated that participants who had experienced discrimination in a healthcare setting were less likely to prefer coordinated healthcare (AOR=0.63, 95% CI 0.42-0.96, p < 0.05). In the final multivariable logistic model, participants who had disclosed their sexual orientation to their PCP (AOR=2.57, 95% CI 1.25-5.21, p < 0.01) and who were comfortable discussing their sexual activity with their PCP (AOR=8.04, 95% CI 4.76-13.58, p < 0.001) were more likely to agree that their PCP adequately addressed the healthcare needs of gay men. Conclusion: Understanding healthcare usage and satisfaction among YAGM is necessary as the healthcare landscape changes, especially given the relatively recent addition of urgent care facilities. The type of facility used as a usual source of care influences the ability to seek comprehensive and coordinated healthcare services. While coordinated primary and sexual healthcare may be ideal, YAGM's preference for such coordination may be limited by experiences of discrimination in primary care settings.
Keywords: healthcare policy, gay men, healthcare access, Affordable Care Act
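The adjusted odds ratios (AORs) quoted above come from logistic-regression coefficients via exponentiation, with Wald confidence intervals. A sketch of that back-and-forth (the standard error below is reverse-engineered from the reported interval, purely for illustration; it is not a value from the study):

```python
import math

def odds_ratio_ci(beta, se, z=1.96):
    """Odds ratio and 95% Wald confidence interval from a logit coefficient."""
    return math.exp(beta), math.exp(beta - z * se), math.exp(beta + z * se)

# Reproduce the reported AOR = 0.54 (95% CI 0.36-0.81) for higher income,
# assuming a Wald interval with a standard error of about 0.207.
aor, lo, hi = odds_ratio_ci(math.log(0.54), 0.207)
print(f"AOR = {aor:.2f}, 95% CI {lo:.2f}-{hi:.2f}")
```

An AOR below 1 (here, 0.54) means the adjusted odds of using a walk-in facility fall by 46% per unit of the income predictor, holding the other covariates fixed.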
Procedia PDF Downloads 239
8632 Application of Logistics Regression Model to Ascertain the Determinants of Food Security among Households in Maiduguri, Metropolis, Borno State, Nigeria
Authors: Abdullahi Yahaya Musa, Harun Rann Bakari
Abstract:
The study examined the determinants of food security among households in Maiduguri Metropolis, Borno State, Nigeria. The objectives of the study were to examine the determinants of food security among households and to identify the coping strategies employed by food-insecure households. The study population comprised 843,964 people, from which 400 respondents were sampled. A self-developed questionnaire was used to collect data; four hundred (400) copies were administered and all were retrieved, a 100% return rate. The study employed descriptive and inferential statistics for data analysis: descriptive statistics (frequency counts and percentages) were used to analyze the socio-economic characteristics of the respondents and objective four, while inferential statistics (logit regression analysis) were used to analyze objective one. The results were presented in tables and discussed according to the research objectives. The study revealed that HHA, HHE, HHSZ, HHSX, HHAS, HHI, HHFS, HHFE, HHAC and HHCDR were the determinants of food security in Maiduguri Metropolis. Relying on less preferred foods, purchasing food on credit, limiting food intake to ensure children get enough, borrowing money to buy foodstuffs, relying on help from relatives or friends outside the household, adult family members skipping or reducing meals because of insufficient finances, and rationing money to household members to buy street food were the coping strategies employed by food-insecure households. The study recommended that the Nigerian Government intensify the fight against the Boko Haram insurgency.
Ending the insurgency would enable farmers to return to farming in Borno State.
Keywords: internally displaced persons, food security, coping strategies, descriptive statistics, logistic regression model, odds ratio
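The logit regression named above models the probability of a household being food secure as a function of its characteristics. A minimal sketch of the model form (the intercept and coefficients below are invented placeholders, not the study's estimates):

```python
import math

def logit_probability(intercept, coefs, xs):
    """P(household is food secure) under a binary logit model: 1/(1+e^-z)."""
    z = intercept + sum(b * x for b, x in zip(coefs, xs))
    return 1.0 / (1.0 + math.exp(-z))

# Hypothetical fitted values: coefficients for, say, household-head education
# (years) and household size, purely for illustration.
b0 = -1.0
coefs = [0.8, -0.3]
p = logit_probability(b0, coefs, [2, 1])
print(f"predicted probability of food security: {p:.3f}")
```

Exponentiating a coefficient gives the odds ratio listed in the keywords: exp(b) is the multiplicative change in the odds of food security per unit increase in that determinant.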
Procedia PDF Downloads 147
8631 Feigenbaum Universality, Chaos and Fractal Dimensions in Discrete Dynamical Systems
Authors: T. K. Dutta, K. K. Das, N. Dutta
Abstract:
This paper is primarily concerned with Ricker's population model, f(x) = x e^(r(1 - x/k)), where r is the control parameter and k is the carrying capacity. Fruitful results are obtained with the following objectives: 1) determination of the bifurcation values leading to a chaotic region; 2) development of the statistical methods and analysis required for the measurement of fractal dimensions; 3) calculation of various fractal dimensions. These results also show that the invariant probability distribution on the attractor, when it exists, provides detailed information about the long-term behavior of the dynamical system. At the end, some open problems are posed for further research.
Keywords: Feigenbaum universality, chaos, Lyapunov exponent, fractal dimensions
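The onset of chaos in the Ricker map can be probed numerically with the Lyapunov exponent, the orbit-averaged log-derivative named in the keywords. A sketch (parameter values chosen for illustration; not the paper's computations): with f(x) = x e^(r(1−x/k)), the derivative is f'(x) = e^(r(1−x/k))(1 − rx/k), and λ = ⟨log|f'(x_n)|⟩ along an orbit, negative on stable orbits and positive in the chaotic regime.

```python
import math

def ricker(x, r, k=1.0):
    """One step of the Ricker map f(x) = x * exp(r * (1 - x/k))."""
    return x * math.exp(r * (1.0 - x / k))

def lyapunov(r, k=1.0, x0=0.5, burn=500, n=5000):
    """Average log|f'(x)| along an orbit after a transient; >0 signals chaos."""
    x = x0
    for _ in range(burn):       # discard the transient
        x = ricker(x, r, k)
    s = 0.0
    for _ in range(n):
        # f'(x) = exp(r*(1 - x/k)) * (1 - r*x/k)
        s += math.log(abs(math.exp(r * (1.0 - x / k)) * (1.0 - r * x / k)))
        x = ricker(x, r, k)
    return s / n

# r = 1.5: stable fixed point at x = k, so lambda = log|1 - r| < 0.
# r = 3.0: beyond the period-doubling cascade, lambda is typically positive.
print("lambda(r=1.5) =", round(lyapunov(1.5), 3))
print("lambda(r=3.0) =", round(lyapunov(3.0), 3))
```

Sweeping r over a grid and recording where λ first crosses zero locates the bifurcation values bounding the chaotic region that objective 1 targets.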
Procedia PDF Downloads 302
8630 Designing the Lesson Instructional Plans for Exploring the STEM Education and Creative Learning Processes to Students' Logical Thinking Abilities with Different Learning Outcomes in Chemistry Classes
Authors: Pajaree Naramitpanich, Natchanok Jansawang, Panwilai Chomchid
Abstract:
The aim of this study was to compare students' logical thinking abilities under two instructional methods, STEM Education and the Creative Learning Process (CLP), each delivered through five lesson instructional plans designed to develop logical thinking. The sample consisted of 90 students from two chemistry classes with different learning outcomes at the 11th grade level in Wapi Phathum School, selected with the cluster random sampling technique. The 45-student experimental group was taught with the STEM Education method and the 45-student control group with the Creative Learning Process. Five instruments were used, including the five lesson instructional plans for STEM Education and for the Creative Learning Process, and a logical thinking test on the Mineral topic. The efficiency of each of the five CLP and STEM lesson plans exceeded the 80/80 standard criterion, based on the IOC index from expert educators. Students' learning achievement was assessed with pre- and post-test techniques and the Logical Thinking Ability Test (LTAT), and dependent t-test analysis showed significant differences between the CLP and STEM groups. Students' perceptions of their chemistry classroom environment, measured with the MCI, also differed between the CLP and STEM methods.
Associations between students' perceptions of their chemistry classroom learning environment (MCI) under the CLP and STEM designs and their logical thinking abilities toward chemistry were significant at the .05 level, with R² values indicating that 68% and 76% of the variance in logical thinking abilities was accounted for in the control and experimental groups, respectively. These results indicate that students' logical thinking abilities and perceived learning achievement in chemistry differed between the CLP and STEM Education groups.
Keywords: design, lesson instructional plans, STEM education, creative learning process, logical thinking ability, different learning outcomes, student, chemistry class
Procedia PDF Downloads 321
8629 A Method for Clinical Concept Extraction from Medical Text
Authors: Moshe Wasserblat, Jonathan Mamou, Oren Pereg
Abstract:
Natural Language Processing (NLP) has made a major leap in the last few years in practical integration into medical solutions; for example, extracting clinical concepts from medical texts such as medical conditions, medications, treatments, and symptoms. However, training and deploying those models in real environments still demands a large amount of annotated data and NLP/Machine Learning (ML) expertise, which makes this process costly and time-consuming. We present a practical and efficient method for clinical concept extraction that requires neither costly labeled data nor ML expertise. The method includes three steps. Step 1: the user injects a large in-domain text corpus (e.g., PubMed); the system then builds, in an unsupervised manner, a contextual model containing vector representations of concepts in the corpus (e.g., Phrase2Vec). Step 2: the user provides a seed set of terms representing a specific medical concept (e.g., for the concept of symptoms, the user may provide ‘dry mouth,’ ‘itchy skin,’ and ‘blurred vision’); the system matches the seed set against the contextual model and extracts the most semantically similar terms (e.g., additional symptoms). The result is a complete set of terms related to the medical concept. Step 3: in production, there is a need to extract medical concepts from unseen medical text. The system extracts key phrases from the new text, matches them against the complete set of terms from step 2, and annotates the most semantically similar ones with the same medical concept category. As an example, the seed symptom concepts would result in the following annotation: “The patient complains of fatigue [symptom], dry skin [symptom], and weight loss [symptom], which can be an early sign for Diabetes.” Our evaluations show promising results for extracting concepts from medical corpora.
The method allows medical analysts to easily and efficiently build taxonomies (in step 2) representing their domain-specific concepts, and automatically annotate a large number of texts (in step 3) for classification/summarization of medical reports.
Keywords: clinical concepts, concept expansion, medical records annotation, medical records summarization
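The step-2 concept expansion reduces, at its core, to nearest-neighbour search in the embedding space. A toy sketch of that matching (the 2-dimensional vectors below are invented stand-ins for Phrase2Vec embeddings, and the threshold is illustrative):

```python
import math

def cosine(u, v):
    """Cosine similarity between two embedding vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv)

def expand_concept(seed_terms, embeddings, threshold=0.8):
    """Return non-seed terms whose embedding is close to any seed embedding."""
    out = set()
    for term, vec in embeddings.items():
        if term in seed_terms:
            continue
        if any(cosine(vec, embeddings[s]) >= threshold for s in seed_terms):
            out.add(term)
    return out

# Hypothetical embeddings: symptom-like terms cluster in one direction,
# a medication term points elsewhere.
embeddings = {
    "dry mouth":  (0.90, 0.10),
    "itchy skin": (0.85, 0.20),
    "fatigue":    (0.80, 0.25),
    "metformin":  (0.10, 0.95),
}
print(expand_concept({"dry mouth", "itchy skin"}, embeddings))
```

In step 3 the same similarity test runs in the other direction: key phrases from unseen text are matched against the expanded term set and inherit its concept label.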
Procedia PDF Downloads 135
8628 The Confounding Role of Graft-versus-Host Disease in Animal Models of Cancer Immunotherapy: A Systematic Review
Authors: Hami Ashraf, Mohammad Heydarnejad
Abstract:
Introduction: The landscape of cancer treatment has been revolutionized by immunotherapy, offering novel therapeutic avenues for diverse cancer types. Animal models play a pivotal role in the development and elucidation of these therapeutic modalities. Nevertheless, the manifestation of Graft-versus-Host Disease (GVHD) in such models poses significant challenges, muddling the interpretation of experimental data within the ambit of cancer immunotherapy. This study is dedicated to scrutinizing the role of GVHD as a confounding factor in animal models used for cancer immunotherapy, alongside proposing viable strategies to mitigate this complication. Method: Employing a systematic review framework, this study undertakes a comprehensive literature survey including academic journals in PubMed, Embase, and Web of Science databases and conference proceedings to collate pertinent research that delves into the impact of GVHD on animal models in cancer immunotherapy. The acquired studies undergo rigorous analysis and synthesis, aiming to assess the influence of GVHD on experimental results while identifying strategies to alleviate its confounding effects. Results: Findings indicate that GVHD incidence significantly skews the reliability and applicability of experimental outcomes, occasionally leading to erroneous interpretations. The literature surveyed also sheds light on various methodologies under exploration to counteract the GVHD dilemma, thereby bolstering the experimental integrity in this domain. Conclusion: GVHD's presence critically affects both the interpretation and validity of experimental findings, underscoring the imperative for strategies to curtail its confounding impacts. Current research endeavors are oriented towards devising solutions to this issue, aiming to augment the dependability and pertinence of experimental results. 
It is incumbent upon researchers to diligently consider and adjust for GVHD's effects, thereby enhancing the translational potential of animal model findings to clinical applications and propelling progress in the arena of cancer immunotherapy.
Keywords: graft-versus-host disease, cancer immunotherapy, animal models, preclinical model
Procedia PDF Downloads 51
8627 On the Survival of Individuals with Type 2 Diabetes Mellitus in the United Kingdom: A Retrospective Case-Control Study
Authors: Njabulo Ncube, Elena Kulinskaya, Nicholas Steel, Dmitry Pshezhetskiy
Abstract:
Life expectancy in the United Kingdom (UK) has been near constant since 2010, particularly for individuals aged 65 years and older; this trend has also been noted in several other countries. The slowdown in the growth of life expectancy was concurrent with an increase in the number of deaths caused by non-communicable diseases. Of particular concern is the worldwide exponential increase in the number of diabetes-related deaths. Previous studies have reported increased mortality hazards among diabetics compared to non-diabetics, and differing effects of antidiabetic drugs on mortality hazards. This study aimed to estimate all-cause mortality hazards and related life expectancies among type 2 diabetes (T2DM) patients in the UK using a time-variant Gompertz-Cox model with frailty, and to understand the major causes of the change in life-expectancy growth in the last decade. A total of 221,182 individuals (30.8% T2DM, 57.6% male) aged 50 years and above, born between 1930 and 1960 inclusive and diagnosed between 2000 and 2016, were selected from The Health Improvement Network (THIN) database of UK primary care data and followed up to 31 December 2016. About 13.4% of participants died during the follow-up period. The overall all-cause mortality hazard ratio of T2DM compared to non-diabetic controls was 1.467 (1.381-1.558) when diagnosed between 50 and 59 years and 1.38 (1.307-1.457) when diagnosed between 60 and 74 years. The estimated life expectancies of T2DM individuals without further comorbidities diagnosed at the age of 60 years were 2.43 (1930-1939 birth cohort), 2.53 (1940-1949 birth cohort) and 3.28 (1950-1960 birth cohort) years less than those of non-diabetic controls. However, the 1950-1960 birth cohort had a steeper hazard function than the 1940-1949 birth cohort for both T2DM and non-diabetic individuals. In conclusion, mortality hazards for people with T2DM continue to be higher than for non-diabetics.
The steeper mortality hazard slope for the 1950-1960 birth cohort may identify a sub-population contributing to the slowdown in life-expectancy growth.
Keywords: T2DM, Gompertz-Cox model with frailty, all-cause mortality, life expectancy
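A life-expectancy gap of the reported order can be illustrated with the Gompertz building block of the model above: with hazard h(t) = a·e^(bt) scaled by a proportional hazard ratio, the survival curve integrates in closed form, and residual life expectancy is the area under it. The Gompertz parameters below are plausible illustrative values for a baseline age of 60, not the paper's fitted estimates:

```python
import math

def gompertz_survival(t, a, b, hr=1.0):
    """S(t) under Gompertz hazard h(t) = hr * a * exp(b*t) (proportional hazards)."""
    return math.exp(-hr * (a / b) * (math.exp(b * t) - 1.0))

def residual_life_expectancy(a, b, hr=1.0, horizon=60.0, dt=0.05):
    """Trapezoidal integral of S(t) from 0 to horizon (years beyond baseline age)."""
    steps = int(horizon / dt)
    total = 0.0
    for i in range(steps):
        s0 = gompertz_survival(i * dt, a, b, hr)
        s1 = gompertz_survival((i + 1) * dt, a, b, hr)
        total += 0.5 * (s0 + s1) * dt
    return total

# Illustrative baseline parameters (NOT the study's fitted values):
a, b = 0.01, 0.1
e_ctrl = residual_life_expectancy(a, b, hr=1.0)
e_t2dm = residual_life_expectancy(a, b, hr=1.467)  # reported HR, 50-59 diagnosis
print(f"control: {e_ctrl:.1f} y, T2DM: {e_t2dm:.1f} y, gap: {e_ctrl - e_t2dm:.1f} y")
```

With these toy parameters the gap comes out at a few years, the same order as the 2.43-3.28 year cohort differences reported in the abstract; the frailty term in the full model lets individual hazards deviate from this shared baseline.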
Procedia PDF Downloads 119