Search results for: event quantification
1378 Modeling the Impact of Controls on Information System Risks
Authors: M. Ndaw, G. Mendy, S. Ouya
Abstract:
Information system risk management helps to reduce or eliminate risk by implementing appropriate controls. In this paper, we propose a model that quantifies the impact of controls on information system risks by automating the residual criticality estimation step of FMECA, which is based on inductive reasoning. For this, we defined three equations based on the type and maturity of controls. For testing, the values obtained with the model were compared to estimates given by interlocutors during different working sessions, and the results were satisfactory. This model allows an optimal assessment of control maturity and facilitates information system risk analysis.
Keywords: information system, risk, control, FMECA method
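The abstract does not reproduce the paper's three equations, so the following is only a minimal sketch of how a residual criticality estimate of this general shape could be automated; the control-type weights, the 0-5 maturity scale, and all names are invented for illustration, not the authors' actual model.

```python
# Hypothetical sketch: residual criticality after applying a control.
# Control-type weights and the 0-5 maturity scale are illustrative assumptions.
CONTROL_TYPE_WEIGHT = {"preventive": 0.8, "detective": 0.5, "corrective": 0.3}

def residual_criticality(initial_criticality: float, control_type: str, maturity: int) -> float:
    """Scale the FMECA criticality down by control effectiveness and maturity."""
    effectiveness = CONTROL_TYPE_WEIGHT[control_type] * (maturity / 5.0)
    return initial_criticality * (1.0 - effectiveness)

print(residual_criticality(12.0, "preventive", maturity=4))  # 12 * (1 - 0.64) = 4.32
```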
Procedia PDF Downloads 355
1377 Competing Risks Modeling Using within Node Homogeneity Classification Tree
Authors: Kazeem Adesina Dauda, Waheed Babatunde Yahya
Abstract:
To design a tree that maximizes within-node homogeneity, there is a need for a homogeneity measure that is appropriate for event history data with multiple risks. We consider the use of Deviance and Modified Cox-Snell residuals as measures of impurity in a Classification and Regression Tree (CART) and compare our results with those of Fiona (2008), in which homogeneity measures were based on the Martingale residual. A data structure approach was used to validate the performance of our proposed techniques via simulation and real-life data. The results for univariate competing risks revealed that using Deviance and Cox-Snell residuals as the response in a within-node homogeneity classification tree performs better than using other residuals, irrespective of the performance technique. Bone marrow transplant data and a double-blinded randomized clinical trial conducted to compare two treatments for patients with prostate cancer were used to demonstrate the efficiency of our proposed method vis-à-vis the existing ones. Results from empirical studies of the bone marrow transplant data showed that the proposed model with the Cox-Snell residual (Deviance=16.6498) performs better than both the Martingale residual (Deviance=160.3592) and the Deviance residual (Deviance=556.8822) for both the event of interest and competing risks. Additionally, results from the prostate cancer data also reveal the superiority of the proposed model over the existing one for both causes; notably, the Cox-Snell residual (MSE=0.01783563) outperforms both the Martingale residual (MSE=0.1853148) and the Deviance residual (MSE=0.8043366). Moreover, these results validate those obtained from the Monte-Carlo studies.
Keywords: within-node homogeneity, Martingale residual, modified Cox-Snell residual, classification and regression tree
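As a hedged sketch of the idea of growing a tree on survival residuals: the Cox-Snell residual of subject i is the estimated cumulative hazard at the observed time, and a regression tree split on such residuals maximizes within-node homogeneity. The placeholder residuals and data below are assumptions; a real analysis would take them from a fitted Cox model.

```python
# Sketch under assumptions: Cox-Snell residuals r_i (estimated cumulative hazards at
# each subject's observed time) are taken as given and used as the tree response.
import numpy as np
from sklearn.tree import DecisionTreeRegressor

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))            # covariates
cox_snell = rng.exponential(size=200)    # placeholder residuals; ~unit-exponential if model fits

tree = DecisionTreeRegressor(max_depth=3, min_samples_leaf=20)
tree.fit(X, cox_snell)                   # splits maximize within-node homogeneity of residuals
print(tree.tree_.node_count)
```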
Procedia PDF Downloads 273
1376 Sweet to Bitter Perception Parageusia: Case of Posterior Inferior Cerebellar Artery Territory Diaschisis
Authors: I. S. Gandhi, D. N. Patel, M. Johnson, A. R. Hirsch
Abstract:
Although distortion of taste perception following a cerebrovascular event may seem to be a frivolous consequence of a classic stroke presentation, altered taste perception places patients at an increased risk of malnutrition, weight loss, and depression, all of which negatively impact quality of life. Impaired taste perception can result from a wide variety of cerebrovascular lesions in various locations, including the pons, insular cortices, and ventral posteromedial nucleus of the thalamus. Wallenberg syndrome, also known as lateral medullary syndrome, has been described to impact taste; however, specific sweet to bitter dysgeusia from such a territory infarction is an infrequent event, and as such, a case is presented. One year prior to presentation, this 64-year-old right-handed woman suffered a right posterior inferior cerebellar artery aneurysm rupture with resultant infarction, culminating in ventriculoperitoneal shunt placement. One and a half months after this event, she noticed the gradual onset of an inability to taste sweet, progressing to all sweet food tasting bitter. Since the onset of her chemosensory problems, the patient has lost 60 pounds. Upon gustatory testing, the patient's taste thresholds showed ageusia to sucrose and hydrochloric acid, and normogeusia to sodium chloride, urea, and phenylthiocarbamide. The gustatory cortex is formed in part by the right insular cortex as well as the right anterior operculum, which are primarily involved in the sensory taste modalities. In this model, sweet is localized in the posterior-most and rostral aspect of the right insular cortex, notably adjacent to the region responsible for bitter taste. The sweet to bitter dysgeusia in our patient suggests the presence of a lesion at this localization. Although the primary lesion in this patient was located in the right medulla of the brainstem, neurodegeneration in the rostral and posterior-most aspect of the right insular cortex may have occurred due to diaschisis. Diaschisis has been described as neurophysiological changes that occur in regions remote from a focal brain lesion. Although hydrocephalus and vasospasm due to aneurysmal rupture may explain the distal foci of impairment, the gradual onset of dysgeusia is more indicative of diaschisis. The perception of sweet now tasting bitter suggests that, in the absence of sweet taste reception, the intrinsic bitter taste of food is being stimulated rather than sweet. In the evaluation and treatment of taste parageusia secondary to cerebrovascular injury, prophylactic neuroprotective measures may be worthwhile. Further investigation is warranted.
Keywords: diaschisis, dysgeusia, stroke, taste
Procedia PDF Downloads 181
1375 Efficient Elimination of Common Allergens through the Application of Dry Microfine Steam on Inert Surfaces
Authors: O. Rachinel, C. Recchia, M. Bourel, B. Recchia
Abstract:
Dry microfine steam (DMS) technology, developed by Laurastar, was shown to effectively eliminate a range of pathogens such as SARS-CoV-2, E. coli, S. aureus and C. albicans. The aim of this study was to investigate the effect of DMS technology on allergens. Therefore, the application of DMS technology was tested on two common allergens (the house dust mite Dermatophagoides pteronyssinus and the cat allergen Fel d 1) on different inert surfaces (e.g., cotton) for 2 to 3 seconds. Quantification of the remaining allergens was performed, and the reduction rates reached 100% in 3 seconds for D. pteronyssinus and 97.74% in 2 seconds for the cat allergen. In conclusion, DMS showed high efficacy in the elimination of common allergens and could be seen as a natural solution for improving domestic hygiene and reducing allergies.
Keywords: steam, allergens, dust mites, pollens
Procedia PDF Downloads 138
1374 The Use of Information and Communication Technology within and between Emergency Medical Teams during a Disaster: A Qualitative Study
Authors: Badryah Alshehri, Kevin Gormley, Gillian Prue, Karen McCutcheon
Abstract:
In a disaster event, sharing patient information between the pre-hospital Emergency Medical Services (EMS) and Emergency Department (ED) hospitals is a complex process during which important information may be altered or lost due to poor communication. The aim of this study was to critically discuss the current evidence base on communication between pre-hospital EMS and ED hospital professionals through the use of Information and Communication Technology (ICT). This study followed a systematic approach: six electronic databases (CINAHL, Medline, Embase, PubMed, Web of Science, and IEEE Xplore Digital Library) were comprehensively searched in January 2018, and a second search was completed in April 2020 to capture more recent publications. The study selection process was undertaken independently by the study authors. Both qualitative and quantitative studies were chosen that focused on factors that are positively or negatively associated with coordinated communication between pre-hospital EMS and ED teams in a disaster event. These studies were assessed for quality, and the data were analyzed according to the key screening themes which emerged from the literature search. Twenty-two studies were included. Eleven studies employed quantitative methods, seven studies used qualitative methods, and four studies used mixed methods. Four themes emerged on communication between EMTs (pre-hospital EMS and ED staff) in a disaster event using ICT. (1) Disaster preparedness plans and coordination. This theme reported that disaster plans are in place in hospitals, and in some cases, there are interagency agreements with pre-hospital and relevant stakeholders. However, the findings showed that the disaster plans highlighted in these studies lacked information regarding coordinated communications within and between pre-hospital and hospital settings. (2) Communication systems used in the disaster. This theme highlighted that although various communication systems are used between and within hospitals and pre-hospital settings, technical issues have influenced communication between teams during disasters. (3) Integrated information management systems. This theme suggested the need for an integrated health information system that can help pre-hospital and hospital staff to record patient data and ensure the data is shared. (4) Disaster training and drills. While some studies analyzed disaster drills and training, the majority of these studies were focused on hospital departments other than EMTs. These studies suggest the need for simulation disaster training and drills, including EMTs. This review demonstrates that considerable gaps remain in the understanding of the communication between EMS and ED hospital staff in relation to disaster response. The review shows that although different types of ICTs are used, various issues remain which affect coordinated communication among the relevant professionals.
Keywords: emergency medical teams, communication, information and communication technologies, disaster
Procedia PDF Downloads 127
1373 Modeling of Thermo Acoustic Emission Memory Effect in Rocks of Varying Textures
Authors: Vladimir Vinnikov
Abstract:
The paper proposes a model of an inhomogeneous rock mass with an initially random distribution of microcracks on mineral grain boundaries. It describes the behavior of cracks in a medium under the effect of a thermal field, with the medium heated instantaneously to a predetermined temperature. Crack growth occurs according to the concepts of fracture mechanics, provided that the stress intensity factor K exceeds the critical value Kc. The modeling of thermally induced acoustic emission memory effects is based on the assumption that every event of crack nucleation or crack growth caused by heating is accompanied by a single acoustic emission event. Parameters of the thermally induced acoustic emission memory effect produced by cyclic heating and cooling (with the temperature amplitude increasing from cycle to cycle) were calculated for several rock texture types (massive, banded, and disseminated). The study substantiates the adaptation of the proposed model to humidity interference with the thermally induced acoustic emission memory effect. The influence of humidity on the thermally induced acoustic emission memory effect in quasi-homogeneous and banded rocks is estimated. It is shown that such modeling allows the structure and texture of rocks to be taken into account, and the influence of interference factors on the distinctness of the thermally induced acoustic emission memory effect to be estimated. The numerical modeling can be used to obtain information about past thermal impacts on rocks and to determine the degree of rock disturbance by means of non-destructive testing.
Keywords: crack growth, cyclic heating and cooling, rock texture, thermo acoustic emission memory effect
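A minimal sketch of the memory-effect mechanism described above, under assumed parameters: each microcrack emits one AE event the first time the cycle temperature exceeds its own strength threshold (the K > Kc condition collapsed to a per-crack critical temperature), so emission resumes only above the previous maximum temperature.

```python
# Assumed thresholds and cycle amplitudes, for illustration only.
import numpy as np

rng = np.random.default_rng(1)
thresholds = rng.uniform(20, 300, size=5000)   # per-crack critical temperatures
t_max_seen = 0.0
for cycle, t_amp in enumerate([100, 150, 200, 250], start=1):
    # AE only from cracks not yet activated in an earlier, cooler cycle
    events = np.sum((thresholds > t_max_seen) & (thresholds <= t_amp))
    print(f"cycle {cycle}: heated to {t_amp} C, AE events = {events}")
    t_max_seen = max(t_max_seen, t_amp)
```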
Procedia PDF Downloads 271
1372 Approximate-Based Estimation of Single Event Upset Effect on Static Random-Access Memory-Based Field-Programmable Gate Arrays
Authors: Mahsa Mousavi, Hamid Reza Pourshaghaghi, Mohammad Tahghighi, Henk Corporaal
Abstract:
Recently, Static Random-Access Memory-based (SRAM-based) Field-Programmable Gate Arrays (FPGAs) have become widely used in aeronautics and space systems, where high dependability is demanded and considered a mandatory requirement. Since the design's circuit is stored in configuration memory in SRAM-based FPGAs, they are very sensitive to Single Event Upsets (SEUs). In addition, the adverse effects of SEUs on the electronics used in space are much higher than on Earth. Thus, developing fault tolerance techniques plays a crucial role in the use of SRAM-based FPGAs in space. However, fault tolerance techniques introduce additional penalties in system parameters, e.g., area, power, performance and design time. In this paper, an accurate estimation of configuration memory vulnerability to SEUs is proposed for approximate-tolerant applications. This vulnerability estimation is highly required for compromising between the overhead introduced by fault tolerance techniques and system robustness. We study applications in which the exact final output value is not necessarily always a concern, meaning that some of the SEU-induced changes in output values are negligible. We therefore define and propose Approximate-based Configuration Memory Vulnerability Factor (ACMVF) estimation to avoid overestimating configuration memory vulnerability to SEUs. We assess the vulnerability of configuration memory by injecting SEUs into configuration memory bits and comparing the output values of a given circuit in the presence of SEUs with the expected correct output. Unlike conventional vulnerability factor calculation methods, which count any deviation from the expected value as a failure, our proposed method considers a threshold margin that depends on the use-case application. Given this threshold margin, a failure occurs only when the difference between the erroneous output value and the expected output value exceeds the margin. The ACMVF is subsequently calculated as the ratio of failures to the total number of SEU injections. A test bench for emulating SEUs and calculating ACMVF is implemented on a Zynq-7000 FPGA platform. This system makes use of the Soft Error Mitigation (SEM) IP core to inject SEUs into configuration memory bits of the target design implemented in the Zynq-7000 FPGA. Experimental results for a 32-bit adder show that, when 1% to 10% deviation from the correct output is considered acceptable, the counted number of failures is reduced by 41% to 59% compared with the number of failures counted by conventional vulnerability factor calculation. This means that the estimation accuracy of the configuration memory vulnerability to SEUs is improved by up to 58% in the case that 10% deviation is acceptable in output results. Note that less than 10% deviation in an addition result is reasonably tolerable for many applications in the approximate computing domain, such as Convolutional Neural Networks (CNNs).
Keywords: fault tolerance, FPGA, single event upset, approximate computing
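A toy software sketch of the ACMVF bookkeeping described above, with an emulated fault-injection function standing in for the SEM-based hardware flow (the function name and the bit-flip emulation are assumptions for illustration): count a failure only when the deviation exceeds the user margin.

```python
# Minimal sketch of the ACMVF idea: inject an SEU, re-run the circuit, and count a
# failure only when |observed - expected| exceeds a user-defined margin.
import random

def run_adder_with_seu(a: int, b: int, bit: int) -> int:
    """Toy stand-in for a 32-bit adder with one flipped configuration bit:
    here the effect is emulated as a single bit error in the sum."""
    return (a + b) ^ (1 << (bit % 32))

def acmvf(n_injections: int, margin_fraction: float) -> float:
    failures = 0
    for _ in range(n_injections):
        a, b = random.getrandbits(31), random.getrandbits(31)
        expected = a + b
        observed = run_adder_with_seu(a, b, bit=random.randrange(32))
        if abs(observed - expected) > margin_fraction * expected:  # tolerable deviation?
            failures += 1
    return failures / n_injections

print(acmvf(10_000, margin_fraction=0.10))  # ACMVF with 10% acceptable deviation
```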
Procedia PDF Downloads 199
1371 The Impact of Pediatric Care, Infections and Vaccines on Community and People’s Lives
Authors: Nashed Atef Nashed Farag
Abstract:
Introduction: Reporting adverse events following vaccination remains a challenge. The WHO has mandated pharmacovigilance centers around the world to submit Adverse Events Following Immunization (AEFI) reports from different countries to a large electronic database of adverse drug event data called VigiBase. Despite sufficient information about AEFIs in VigiBase, these data are not available to the general public. However, the WHO has an alternative website called VigiAccess, an open-access website that serves as an archive for reported adverse reactions and AEFIs. The aim of the study was to establish a reporting model for a number of commonly used vaccines in the VigiAccess system. Methods: On February 5, 2018, VigiAccess was comprehensively searched for AEFI reports on the measles vaccine, oral polio vaccine (OPV), yellow fever vaccine, pneumococcal vaccine, rotavirus vaccine, meningococcal vaccine, tetanus vaccine, and tuberculosis vaccine (BCG). These are reports from all pharmacovigilance centers around the world since they joined the WHO Drug Monitoring Program. Results: After an extensive search, VigiAccess found 9,062 AEFIs from the measles vaccine, 185,829 AEFIs from the OPV vaccine, 24,577 AEFIs from the yellow fever vaccine, 317,208 AEFIs from the pneumococcal vaccine, 73,513 AEFIs from the rotavirus vaccine, 145,447 AEFIs from the meningococcal vaccine, 22,781 AEFIs from the tetanus vaccine, and 35,556 AEFIs from the BCG vaccine. Conclusion: The study found that among the eight vaccines examined, pneumococcal vaccines were associated with the highest number of AEFIs, while measles vaccines were associated with the fewest AEFIs.
Keywords: adverse events following immunization, adverse reactions, VigiAccess, adverse event reporting
Procedia PDF Downloads 73
1370 Simultaneous Bilateral Patella Tendon Rupture: A Systematic Review
Authors: André Rui Coelho Fernandes, Mariana Rufino, Divakar Hamal, Amr Sousa, Emma Fossett, Kamalpreet Cheema
Abstract:
Aim: A single patella tendon rupture is relatively uncommon, but a simultaneous bilateral event is a rare occurrence and has been scarcely reviewed in the literature. This review was carried out to analyse the existing literature on this event, with the aim of proposing a standardised approach to the diagnosis and management of this injury. Methods: A systematic review was conducted using the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) guidelines. Three independent reviewers conducted searches in PubMed, OvidSP for Medline and Embase, as well as the Cochrane Library, using the same search strategy. From a total of 183 studies, 45 were included, i.e., 90 patellas. Results: 46 patellas (51%) had a Type 1 rupture, with Type 3 being the least common, with only 7 patellas sustaining this injury. The mean Insall-Salvati ratio for each knee was 1.62 (R) and 1.60 (L). Direct primary repair was the most common surgical technique compared to tendon reconstruction, with end-to-end and transosseous techniques split almost equally. Brace immobilisation was preferred over cast, with a mean start to weight-bearing of 3.23 weeks post-op. Conclusions: Bilateral patellar tendon rupture is a rare injury that should be considered in patients with knee extensor mechanism disruption. The key limitation of this study was the low number of patients encompassed by the eligible literature. There is space for a higher level of evidence study, specifically regarding the choice of surgical treatment and methods, as well as post-operative management, which could potentially improve outcomes in the management of this injury.
Keywords: trauma and orthopaedic surgery, bilateral patella, tendon rupture, trauma
Procedia PDF Downloads 138
1369 Peril's Environment of Energetic Infrastructure Complex System, Modelling by the Crisis Situation Algorithms
Authors: Jiří F. Urbánek, Alena Oulehlová, Hana Malachová, Jiří J. Urbánek Jr.
Abstract:
Crisis situations are investigated and modelled within a complex system of energetic critical infrastructure operating in perilous environments. Every crisis situation and peril has its origin in the occurrence of an emergency/crisis event, and both need critical/crisis interface assessment. An emergency event may be expected, in which case crisis scenarios can be pre-prepared by the pertinent organizational crisis management authorities for coping with it; or it may be unexpected, without a pre-prepared scenario. Both, however, need operational coping by means of crisis management. The operation, forms, characteristics, behaviour and utilization of crisis management have various qualities, depending on the real perils facing the critical infrastructure organization and on prevention and training processes. The aim is always better security and continuity of the organization, and achieving it requires finding and investigating critical/crisis zones and functions in models of critical infrastructure organizations operating in the pertinent peril environment. Our DYVELOP (Dynamic Vector Logistics of Processes) method is available for this purpose. Here, it is necessary to derive and create an identification algorithm for critical/crisis interfaces. The locations of critical/crisis interfaces are the flags of a crisis situation in a model of a critical infrastructure organization. The model of a crisis situation is then displayed for a real organization of the Czech energetic critical infrastructure in a real peril environment. These efficient measures are necessary for infrastructure protection. They will be derived for peril mitigation, crisis situation coping, and the organization's environmentally friendly survival, continuity, and advanced possibilities of sustainable development.
Keywords: algorithms, energetic infrastructure complex system, modelling, peril's environment
Procedia PDF Downloads 403
1368 Impact of Unusual Dust Event on Regional Climate in India
Authors: Kanika Taneja, V. K. Soni, Kafeel Ahmad, Shamshad Ahmad
Abstract:
A severe dust storm generated by a western disturbance over north Pakistan and adjoining Afghanistan affected the north-west region of India between May 28 and 31, 2014, resulting in significant reductions in air quality and visibility. The air quality of the affected region degraded drastically: the PM10 concentration peaked at a very high value of around 1018 μg m⁻³ during the dust storm hours of May 30, 2014, at New Delhi. The present study describes the aerosol optical properties monitored during the dust days using a ground-based multi-wavelength sky radiometer over the National Capital Region of India. A high Aerosol Optical Depth (AOD) at 500 nm of 1.356 ± 0.19 was observed at New Delhi, while the Angstrom exponent (alpha) dropped to 0.287 on May 30, 2014. The variation in the Single Scattering Albedo (SSA) and the real n(λ) and imaginary k(λ) parts of the refractive index indicated that the dust event shifted the optical state toward more absorbing aerosols. The single scattering albedo, refractive index, volume size distribution and asymmetry parameter (ASY) values suggested that dust aerosols predominated over anthropogenic aerosols in the urban environment of New Delhi. The large reduction in radiative flux at the surface caused significant surface cooling. Direct Aerosol Radiative Forcing (DARF) was calculated using a radiative transfer model during the dust period. A consistent increase in surface cooling was evident, ranging from -31 Wm⁻² to -82 Wm⁻², with an increase in atmospheric heating from 15 Wm⁻² to 92 Wm⁻² and forcing from -2 Wm⁻² to 10 Wm⁻² at the top of the atmosphere.
Keywords: aerosol optical properties, dust storm, radiative transfer model, sky radiometer
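The three forcing ranges quoted above follow the standard DARF bookkeeping, shown in this small sketch: forcing is the flux difference with and without aerosols, and atmospheric heating is the difference between top-of-atmosphere and surface forcing. The two input values are the abstract's strongest-day extremes; the helper name is illustrative.

```python
# Standard DARF bookkeeping; values are the abstract's reported extremes.
def darf(flux_with_aerosol: float, flux_without_aerosol: float) -> float:
    return flux_with_aerosol - flux_without_aerosol

darf_surface, darf_toa = -82.0, 10.0        # W m^-2, strongest-day values
darf_atmosphere = darf_toa - darf_surface   # 92 W m^-2 of heating retained in the atmosphere
print(darf_atmosphere)
```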
Procedia PDF Downloads 378
1367 Extreme Value Theory Applied in Reliability Analysis: Case Study of Diesel Generator Fans
Authors: Jelena Vucicevic
Abstract:
Reliability analysis represents a very important task in different areas of work. In any industry, it is crucial for maintenance, efficiency, safety and monetary costs. There are ways to calculate reliability, unreliability, failure density and failure rate. In this paper, the reliability of diesel generator fans was calculated through Extreme Value Theory. Extreme Value Theory is not widely used in the engineering field; its usage is well known in other areas, such as hydrology, meteorology and finance. The significance of this theory lies in the fact that, unlike other statistical methods, it focuses on rare and extreme values rather than averages. It should be noted that this theory is not designed exclusively for extreme events, but for extreme values in any event. Therefore, this is a great opportunity to apply the theory and test whether it can be applied in this situation. The significance of the work is the calculation of time to failure, or reliability, in a new way, using statistics. Another advantage of this calculation is that no technical details are needed, and it can be implemented for any part for which we need to know the time to failure in order to schedule appropriate maintenance, but also to maximize usage and minimize costs. In this case, calculations have been made on diesel generator fans, but the same principle can be applied to any other part. The data for this paper came from a field engineering study of the time to failure of diesel generator fans. The ultimate goal was to decide whether or not to replace the working fans with higher quality fans to prevent future failures. The results achieved with this method show the approximate time for which the fans will work as they should, and the probability of the fans working longer than a certain estimated time.
Keywords: extreme value theory, lifetime, reliability analysis, statistics, time to failure
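As a minimal sketch of this kind of analysis (the failure times below are invented placeholders, not the study's field data): fit a generalized extreme value distribution to the times to failure and read off the survival probability past a target time.

```python
# Fit a GEV distribution to time-to-failure data and estimate a survival probability.
import numpy as np
from scipy import stats

failures_h = np.array([450., 1150., 1600., 1850., 2100., 3750., 4300., 6100.])  # assumed data (hours)
shape, loc, scale = stats.genextreme.fit(failures_h)

t_target = 2000.0
p_survive = stats.genextreme.sf(t_target, shape, loc=loc, scale=scale)
print(f"P(fan lasts beyond {t_target:.0f} h) = {p_survive:.2f}")
```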
Procedia PDF Downloads 329
1366 Non-Uniform Filter Banks-based Minimum Distance to Riemannian Mean Classification in Motor Imagery Brain-Computer Interface
Authors: Ping Tan, Xiaomeng Su, Yi Shen
Abstract:
The motion intention in a motor imagery brain-computer interface is identified by classifying the event-related desynchronization (ERD) and event-related synchronization (ERS) characteristics of the sensorimotor rhythm (SMR) in EEG signals. When the subject imagines different limbs or different parts moving, the rhythm components and bandwidth change, and this varies from person to person. Finding the effective sensorimotor frequency band of a subject is therefore directly related to the classification accuracy of the brain-computer interface. To solve this problem, this paper proposes a Minimum Distance to Riemannian Mean classification method based on non-uniform filter banks. During the training phase, the EEG signals are first decomposed into multiple signals of different bandwidths using multiple band-pass filters; then the spatial covariance characteristics of each frequency band signal are computed as feature vectors. These feature vectors are classified by the MDRM (Minimum Distance to Riemannian Mean) method, and cross-validation is employed to obtain the effective sensorimotor frequency bands. During the test phase, the test signals are filtered by the band-pass filters of the effective sensorimotor frequency bands, and the extracted spatial covariance feature vectors are classified using MDRM. Experiments on the BCI Competition IV 2a dataset show that the proposed method is superior to other classification methods.
Keywords: non-uniform filter banks, motor imagery, brain-computer interface, minimum distance to Riemannian mean
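A minimal numpy/scipy sketch of the MDRM step, with assumed data shapes (channels x samples per trial) and random stand-in data. The Riemannian distance is d(A, B) = ||log(A^(-1/2) B A^(-1/2))||_F; for brevity the class mean is approximated by the log-Euclidean mean, whereas the actual Riemannian mean would be computed iteratively.

```python
import numpy as np
from scipy.linalg import fractional_matrix_power, logm, expm

def riemann_dist(A, B):
    A_inv_sqrt = fractional_matrix_power(A, -0.5)
    return np.linalg.norm(logm(A_inv_sqrt @ B @ A_inv_sqrt), "fro")

def log_euclidean_mean(covs):
    return expm(np.mean([logm(C) for C in covs], axis=0))

def mdrm_predict(trial, class_means):
    C = np.cov(trial)                                 # spatial covariance feature
    return min(class_means, key=lambda k: riemann_dist(class_means[k], C))

rng = np.random.default_rng(0)
def rand_spd(n=4):
    A = rng.normal(size=(n, n))
    return A @ A.T + n * np.eye(n)                    # random SPD stand-in for a class mean

class_means = {"left hand": rand_spd(), "right hand": rand_spd()}
trial = rng.normal(size=(4, 64))                      # one band-passed EEG trial
print(mdrm_predict(trial, class_means))
```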
Procedia PDF Downloads 126
1365 Next Generation UK Storm Surge Model for the Insurance Market: The London Case
Authors: Iacopo Carnacina, Mohammad Keshtpoor, Richard Yablonsky
Abstract:
Non-structural protection measures against flooding are becoming increasingly popular flood risk mitigation strategies. In particular, coastal flood insurance impacts not only private citizens but also insurance and reinsurance companies, who may require it to retain solvency and to better understand the risks they face from a catastrophic coastal flood event. In this context, a framework is presented here to assess the risk of coastal flooding across the UK. The area has a long history of catastrophic flood events, including the Great Flood of 1953 and the 2013 Cyclone Xaver storm, both of which led to significant loss of life and property. The current framework leverages a technology based on a hydrodynamic model (Delft3D Flexible Mesh). This flexible mesh technology, coupled with a calibration technique, allows for better utilisation of computational resources, leading to higher resolution and more detailed results. The generation of a stochastic set of extratropical cyclone (ETC) events supports the evaluation of financial losses for the whole area, also accounting for correlations between different locations in different scenarios. Finally, the solution shows a detailed analysis for the Thames River, leveraging the information available on flood barriers and levees. Two realistic disaster scenarios for the Greater London area are simulated: in the first scenario, the storm surge intensity is not high enough to fail London's flood defences, but in the second scenario, London's flood defences fail, highlighting the potential losses from a catastrophic coastal flood event.
Keywords: storm surge, stochastic model, levee failure, Thames River
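A hedged sketch of how a stochastic event set feeds financial loss evaluation in this kind of catastrophe-modeling framework: each simulated ETC carries an annual occurrence rate and a modeled loss, and the average annual loss is the rate-weighted sum. The event names, rates, and losses below are invented for illustration.

```python
# Insurance-style event-loss table built from a stochastic ETC set (assumed values).
event_loss_table = [
    {"event": "ETC-0001", "annual_rate": 0.002, "loss_gbp": 4.0e9},   # defences fail
    {"event": "ETC-0002", "annual_rate": 0.050, "loss_gbp": 1.2e8},   # defences hold
]
aal = sum(e["annual_rate"] * e["loss_gbp"] for e in event_loss_table)
print(f"average annual loss: {aal:.3e} GBP")
```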
Procedia PDF Downloads 232
1364 Indirect Genotoxicity of Diesel Engine Emission: An in vivo Study Under Controlled Conditions
Authors: Y. Landkocz, P. Gosset, A. Héliot, C. Corbière, C. Vendeville, V. Keravec, S. Billet, A. Verdin, C. Monteil, D. Préterre, J-P. Morin, F. Sichel, T. Douki, P. J. Martin
Abstract:
Air pollution produced by automobile traffic is one of the main sources of pollutants in the urban atmosphere and is largely due to the exhausts of diesel engine powered vehicles. In 2012, the International Agency for Research on Cancer, which is part of the World Health Organization, classified diesel engine exhaust as carcinogenic to humans (Group 1), based on sufficient evidence that exposure is associated with an increased risk of lung cancer. Amongst the strategies aimed at limiting exhausts while taking the health impact of automobile pollution into consideration, filtration of the emissions and the use of biofuels are being developed, but their toxicological impact is largely unknown. Diesel exhausts are indeed complex mixtures of toxic substances that are difficult to study from a toxicological point of view, due to the necessary characterization of the pollutants, sampling difficulties, potential synergy between the compounds, and the wide variety of biological effects. Here, we studied the potential indirect genotoxicity of diesel engine emissions through on-line exposure of rats in inhalation chambers to a subchronic, high but realistic dose. Following exposure to standard gasoil +/- rapeseed methyl ester, either upstream or downstream of a particle filter, or control treatment, rats were sacrificed and their lungs collected. The following indirect genotoxicity parameters were measured: (i) telomerase activity and telomere length, associated with rTERT and rTERC gene expression by RT-qPCR, on frozen lungs; (ii) γH2AX quantification, representing double-strand DNA breaks, by immunohistochemistry on formalin-fixed, paraffin-embedded (FFPE) lung samples. These preliminary results will then be combined with the global cellular response analyzed by pan-genomic microarrays, monitoring of oxidative stress, and the quantification of primary DNA lesions, in order to identify biological markers associated with a potential pro-carcinogenic response of diesel or biodiesel, with or without filters, in a relevant system of in vivo exposure.
Keywords: diesel exhaust exposed rats, γH2AX, indirect genotoxicity, lung carcinogenicity, telomerase activity, telomere length
Procedia PDF Downloads 390
1363 Applying the Global Trigger Tool in German Hospitals: A Retrospective Study in Surgery and Neurosurgery
Authors: Mareen Brosterhaus, Antje Hammer, Steffen Kalina, Stefan Grau, Anjali A. Roeth, Hany Ashmawy, Thomas Gross, Marcel Binnebosel, Wolfram T. Knoefel, Tanja Manser
Abstract:
Background: The identification of critical incidents in hospitals is an essential component of improving patient safety. To date, various methods have been used to measure and characterize such critical incidents. These methods are often viewed by physicians and nurses as external quality assurance, and this creates obstacles to the reporting of events and the implementation of recommendations in practice. One way to overcome this problem is to use tools that directly involve staff in measuring indicators of the quality and safety of care in their department. One such instrument is the Global Trigger Tool (GTT), which helps physicians and nurses identify adverse events by systematically reviewing randomly selected patient records. Based on so-called 'triggers' (warning signals), indications of adverse events can be identified. While the tool is already used internationally, its implementation in German hospitals has been very limited. Objectives: This study aimed to assess the feasibility and potential of the Global Trigger Tool for identifying adverse events in German hospitals. Methods: A total of 120 patient records were randomly selected from two surgical departments and one neurosurgery department of three university hospitals in Germany, over a period of two months per department, between January and July 2017. The records were reviewed using an adaptation of the German version of the Institute for Healthcare Improvement Global Trigger Tool to identify triggers and adverse event rates per 1000 patient-days and per 100 admissions. The severity of adverse events was classified using the National Coordinating Council for Medication Error Reporting and Prevention. Results: A total of 53 adverse events were detected in the three departments. This corresponded to adverse event rates of 25.5 to 72.1 per 1000 patient-days and of 25.0 to 60.0 per 100 admissions across the three departments. 98.1% of the identified adverse events were associated with non-permanent harm, either without (Category E, 71.7%) or with (Category F, 26.4%) the need for prolonged hospitalization. One adverse event (1.9%) was associated with potentially permanent harm to the patient. We also identified practical challenges in the implementation of the tool, such as the need to adapt the Global Trigger Tool to the respective department. Conclusions: The Global Trigger Tool is feasible and an effective instrument for quality measurement when adapted to departmental specifics. Based on our experience, we recommend continuous use of the tool, thereby directly involving clinicians in quality improvement.
Keywords: adverse events, global trigger tool, patient safety, record review
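The two rate conventions reported above follow directly from simple arithmetic, sketched here with assumed denominators (the abstract reports the rates, not the underlying patient-day and admission totals, so these numbers are illustrative only):

```python
# GTT rate arithmetic with illustrative denominators for one department.
adverse_events = 18          # events found in one department's reviewed records
patient_days, admissions = 520, 40
print(adverse_events / patient_days * 1000)   # events per 1000 patient-days
print(adverse_events / admissions * 100)      # events per 100 admissions
```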
Procedia PDF Downloads 251
1362 Upper Jurassic Foraminiferal Assemblages and Palaeoceanographical Changes in the Central Part of the East European Platform
Authors: Clementine Colpaert, Boris L. Nikitenko
Abstract:
The Upper Jurassic foraminiferal assemblages of the East European Platform were intensively investigated through the 20th century for biostratigraphical and, to a lesser degree, palaeoecological and palaeobiogeographical purposes. Over the Late Jurassic, the platform was a shallow epicontinental sea that extended from the Tethys to the Arctic through the Pechora Sea, and further toward the northeast into the West Siberian Sea. Foraminiferal assemblages of the Russian Sea were strongly affected by sea-level changes and were controlled by alternating Boreal to Peritethyan influences. The central part of the East European Platform displays very rich and diverse foraminiferal assemblages. Two sections have been analyzed: the Makar'yev Section in the Moscow Depression and the Gorodishi Section in the Ul'yanovsk Depression. Based on the evolution of foraminiferal assemblages, the palaeoenvironment has been reconstructed and sea-level changes have been refined. The aim of this study is to understand palaeoceanographical changes throughout the Oxfordian – Kimmeridgian of the central part of the Russian Sea. The Oxfordian was characterized by a general transgressive event punctuated by small regressive phases. The platform was connected toward the south with the Tethys and Peritethys. During the Middle Oxfordian, the opening of a pathway of warmer water from the North Tethys region to the Boreal Realm favoured the migration of planktonic foraminifera and the appearance of new benthic taxa, associated with increased temperature and primary production. During the Late Oxfordian, colder water inputs associated with the microbenthic community crisis may be a response to the closure of this warm-water corridor and the disappearance of planktonic foraminifera. The microbenthic community crisis is probably due to the increased sedimentation rate in the transition from the maximum flooding surface to a second-order regressive event, increasing productivity and inputs of organic matter along with a sharp decrease of oxygen in the sediment. It was followed during the Early Kimmeridgian by a replacement of foraminiferal assemblages. Almost all of the Kimmeridgian is characterized by an abundance of taxa in common with the Boreal and Subboreal Realms. Connections toward the south became dominant again after a small regressive event recorded during the Late Kimmeridgian, associated with an abundance of taxa in common with the Subboreal Realm and Peritethys, such as Crimea and Caucasus taxa. Foraminiferal assemblages of the East European Platform are strongly affected by palaeoecological changes and may provide a very good model for biofacies typification under Boreal and Subboreal environments. The East European Platform appears to be a key area for understanding large-scale Upper Jurassic palaeoceanographical changes, being connected with Boreal to Peritethyan basins.
Keywords: foraminifera, palaeoceanography, palaeoecology, Upper Jurassic
Procedia PDF Downloads 248
1361 Biochemical Characterization of Goat Meat in Algeria
Authors: Hafid Nadia, Meziane Toufik
Abstract:
The aim of this study was the characterization of goat meat by the determination of its quantity and quality in the Batna region. The first part was an evaluation of production and consumption. The investigations show that goat meat ranks third after mutton and beef, and that it is especially consumed by the indigenous population located in mountainous and steep areas. The second part treats the nutritional quality of this meat through the quantification of its chemical composition, including the fat profile, and establishes a link between animal age and the values of these parameters. Moisture, fat content, and cholesterol levels varied with age. Because of the decreasing level of cholesterol in chevon, this meat is recommended for consumption to prevent or reduce the incidence of coronary disease and heart attack.
Keywords: biochemical composition, cholesterol, goat meat, heart attack
Procedia PDF Downloads 670
1360 Simulating Human Behavior in (Un)Built Environments: Using an Actor Profiling Method
Authors: Hadas Sopher, Davide Schaumann, Yehuda E. Kalay
Abstract:
This paper addresses the shortcomings of architectural computation tools in representing human behavior in built environments prior to the construction and occupancy of those environments. Evaluating whether a design fits the needs of its future users is currently done solely post-construction, or is based on the knowledge and intuition of the designer. This issue is of high importance when designing complex buildings such as hospitals, where the quality of treatment as well as patient and staff satisfaction are of major concern. Existing computational pre-occupancy human behavior evaluation methods are geared mainly to testing ergonomic issues, such as wheelchair accessibility, emergency egress, etc. As such, they rely on Agent Based Modeling (ABM) techniques, which emphasize the individual user. Yet we know that most human activities are social and involve a number of actors working together, which ABM methods cannot handle. Therefore, we present an event-based model that manages the interaction between multiple Actors, Spaces, and Activities to describe dynamically how people use spaces. This approach requires expanding the computational representation of Actors beyond their physical description to include psychological, social, cultural, and other parameters. The model presented in this paper includes cognitive abilities and rules that describe the response of actors to their physical and social surroundings, based on the actors' internal status. The model has been applied in a simulation of hospital wards and showed adaptability to a wide variety of situated behaviors and interactions.
Keywords: agent based modeling, architectural design evaluation, event modeling, human behavior simulation, spatial cognition
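A minimal sketch, with invented class and field names, of the event idea described above: an Event binds Actors, a Space, and an Activity, and fires only when its preconditions on actor status hold, so behavior emerges from the group rather than from a single agent.

```python
from dataclasses import dataclass, field

@dataclass
class Actor:
    name: str
    role: str
    status: dict = field(default_factory=dict)   # psychological/social state, e.g. fatigue

@dataclass
class Event:
    activity: str
    space: str
    actors: list

    def ready(self) -> bool:                     # social precondition: all actors available
        return all(a.status.get("available", True) for a in self.actors)

    def run(self):
        if self.ready():
            print(f"{self.activity} in {self.space}: " + ", ".join(a.name for a in self.actors))

nurse, patient = Actor("N1", "nurse"), Actor("P3", "patient")
Event("wound check", "ward A, bed 3", [nurse, patient]).run()
```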
Procedia PDF Downloads 264
1359 Event Data Representation Based on Time Stamp for Pedestrian Detection
Authors: Yuta Nakano, Kozo Kajiwara, Atsushi Hori, Takeshi Fujita
Abstract:
In association with the wave of electric vehicles (EVs), low energy consumption systems have become more and more important. One of the key technologies for realizing low energy consumption is the dynamic vision sensor (DVS), also called an event sensor or neuromorphic vision sensor. This sensor has several features, such as a high temporal resolution, which can achieve 1 Mframe/s, and a high dynamic range (120 dB). However, the property that contributes most to low energy consumption is its sparsity: the sensor only captures pixels that have an intensity change. In other words, there is no signal in areas without any intensity change. That is to say, this sensor is more energy efficient than conventional sensors such as RGB cameras because redundant data can be removed. On the other hand, the data is difficult to handle because its format is completely different from an RGB image: the acquired signals are asynchronous and sparse, and each signal is composed of an x-y coordinate, a polarity (two values: +1 or -1), and a timestamp; it does not include intensity such as RGB values. Therefore, as existing algorithms cannot be used straightforwardly, a new processing algorithm has to be designed to cope with DVS data. To overcome the difficulties caused by these format differences, most prior art builds frame data and feeds it to deep learning models such as Convolutional Neural Networks (CNNs) for object detection and recognition. However, even when the data can be fed this way, it is still difficult to achieve good performance due to the lack of intensity information. Although polarity is often used as intensity instead of an RGB pixel value, polarity information is clearly not rich enough. Considering this context, we propose to use the timestamp information as the data representation fed to deep learning. Concretely, we first build frame data divided by a certain time period, then assign an intensity value in response to the timestamp in each frame; for example, a high value is given to a recent signal. We expected this data representation to capture the features of moving objects in particular, because the timestamp represents movement direction and speed. Using this proposed method, we built our own dataset with a DVS fixed on a parked car to develop an application for a surveillance system that can detect persons around the car. We think the DVS is one of the ideal sensors for surveillance purposes because it can run for a long time with low energy consumption in a static scene. For comparison purposes, we reproduced a state-of-the-art method as a benchmark, which builds frames in the same way as ours and feeds polarity information to a CNN. We then measured the object detection performance of the benchmark and our method on the same dataset. As a result, our method achieved an F1 score up to 7 points greater than the benchmark.
Keywords: event camera, dynamic vision sensor, deep learning, data representation, object recognition, low energy consumption
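A minimal sketch of the proposed timestamp representation under assumed shapes: events are (x, y, polarity, t) tuples, and each frame pixel stores the normalized timestamp of the most recent event in the window, so newer motion appears brighter. The function name and the keep-newest-per-pixel rule are assumptions; variants could, for example, also encode polarity.

```python
import numpy as np

def timestamp_frame(events, width, height, t_start, t_end):
    frame = np.zeros((height, width), dtype=np.float32)
    for x, y, _polarity, t in events:                 # polarity unused in this variant
        if t_start <= t < t_end:
            recency = (t - t_start) / (t_end - t_start)
            frame[y, x] = max(frame[y, x], recency)   # keep the newest event per pixel
    return frame

events = [(10, 5, +1, 0.002), (10, 5, -1, 0.009), (3, 2, +1, 0.004)]
print(timestamp_frame(events, width=16, height=8, t_start=0.0, t_end=0.01)[5, 10])  # 0.9
```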
Procedia PDF Downloads 101
1358 Optimal ECG Sampling Frequency for Multiscale Entropy-Based HRV
Authors: Manjit Singh
Abstract:
Multiscale entropy (MSE) is an extensively used index for providing a general understanding of the multiple complexity of the physiologic mechanism of heart rate variability (HRV), which operates on a wide range of time scales. Accurate selection of the electrocardiogram (ECG) sampling frequency is an essential concern for clinically significant HRV quantification; high ECG sampling rates increase memory requirements and processing time, whereas low sampling rates degrade signal quality and result in clinically misinterpreted HRV. In this work, the impact of ECG sampling frequency on MSE-based HRV has been quantified. MSE measures are found to be sensitive to ECG sampling frequency, and the effect of sampling frequency is a function of time scale.
Keywords: ECG (electrocardiogram), heart rate variability (HRV), multiscale entropy, sampling frequency
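A minimal sketch of the MSE computation on an RR-interval series (the input here is synthetic, and the sample entropy implementation is simplified for brevity): coarse-grain the series at each scale by non-overlapping averaging, then take the sample entropy of each coarse-grained series.

```python
import numpy as np

def sample_entropy(x, m=2, r=0.2):
    r *= np.std(x)
    def count(mm):
        templates = np.array([x[i:i + mm] for i in range(len(x) - mm + 1)])
        d = np.max(np.abs(templates[:, None] - templates[None, :]), axis=2)  # Chebyshev
        return np.sum(d <= r) - len(templates)        # exclude self-matches
    b, a = count(m), count(m + 1)
    return -np.log(a / b) if a > 0 and b > 0 else np.inf

def multiscale_entropy(rr, max_scale=5):
    out = []
    for s in range(1, max_scale + 1):
        n = len(rr) // s
        coarse = rr[:n * s].reshape(n, s).mean(axis=1)  # non-overlapping averages
        out.append(sample_entropy(coarse))
    return out

rr = np.random.default_rng(2).normal(0.8, 0.05, 1000)  # synthetic RR intervals (s)
print(multiscale_entropy(rr))
```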
Procedia PDF Downloads 271
1357 Illness-Related PTSD Among Type 1 Diabetes Patients
Authors: Omer Zvi Shaked, Amir Tirosh
Abstract:
Type 1 Diabetes (T1DM) is an incurable chronic illness with no known preventive measures. Excess insulin therapy can lead to hypoglycemia, with neuroglycopenic symptoms such as shakiness, nausea, sweating, irritability, fatigue, excessive thirst or hunger, weakness, seizure, and coma. Severe Hypoglycemia (SH) is also considered a highly aversive event, since it may put patients at risk of injury and death, which matches the criteria for a traumatic event. SH has a prevalence of around 20%, which makes it a primary medical issue. One of the results of SH is an intense emotional fear reaction taking the form of post-traumatic stress (PTS) symptoms, causing many patients to avoid insulin therapy and social activities in order to avoid the possibility of hypoglycemia. As a result, they are at risk of irreversible health deterioration and medical complications. Fear of Hypoglycemia (FOH) is, therefore, a major disturbance for T1DM patients. FOH differs from prevalent post-traumatic stress reactions to other forms of traumatic events, since the threat to life continuously exists in the patient's body. That is, it is highly probable that orthodox interventions may not be sufficient to help patients after SH regain healthy social function and proper medical treatment. Accordingly, the current presentation will demonstrate the results of a study conducted among T1DM patients after SH. The study was designed in two stages. First, a preliminary qualitative phenomenological study among ten patients after SH was conducted. Analysis revealed that after SH, patients confuse stress symptoms with hypoglycemia symptoms, divide life into before and after the event, report a constant sense of fear, a loss of freedom, a significant decrease in social functioning, a catastrophic thinking pattern, a dichotomous split between the self and the body, internalization of illness identity, a loss of internal locus of control, a damaged self-representation, and severe loneliness from never being understood by others. The second stage was a two-step intervention study among five patients after SH. The first part of the intervention included three months of therapeutic third-wave CBT. The contents of the therapeutic process were: acceptance of fear and tolerance of stress; cognitive defusion combined with emotional self-regulation; the adoption of an active position relying on personal values; and self-compassion. The intervention then included one week of practical, real-time 24/7 support by trained medical personnel, alongside gradual exposure to increased insulin therapy in a protected environment. The results of the intervention are a decrease in stress symptoms, increased social functioning, increased well-being, and decreased avoidance of medical treatment. The presentation will discuss the unique emotional state of T1DM patients after SH, then the effectiveness of the intervention for patients with chronic conditions after a traumatic event. The presentation will make evident the unique situation of illness-related PTSD. It will also demonstrate the need for multi-professional collaboration between social work and medical care for populations with chronic medical conditions. Limitations of the study and recommendations for further research will be discussed.
Keywords: type 1 diabetes, chronic illness, post-traumatic stress, illness-related PTSD
Procedia PDF Downloads 177
1356 Good Governance Complementary to Corruption Abatement: A Cross-Country Analysis
Authors: Kamal Ray, Tapati Bhattacharya
Abstract:
Private use of public office for private gain could be a tentative definition of corruption, and the most distasteful aspect of corruption is not that it exists, nor that it is pervasive, but that it is socially acknowledged in the global economy, especially in developing nations. We attempted to assess the interrelationship between the Corruption Perception Index (CPI) and the principal governance indicators defined by the World Bank: Control of Corruption (CC), Rule of Law (RL), Regulatory Quality (RQ) and Government Effectiveness (GE). Our empirical investigation concentrates upon the degree to which the governance indicators are reflected in the CPI, in order to single out the most powerful corruption-generating indicator in the selected countries. We collected time series data on the above governance indicators (CC, RL, RQ and GE) for eleven selected countries, from 1996 to 2012, from the World Bank data set. The countries are the USA, UK, France, Germany, Greece, China, India, Japan, Thailand, Brazil, and South Africa. The Corruption Perception Index (CPI) of these countries for the period 1996 to 2012 was also collected. The graphical method of a simple line diagram against the time series data on CPI is applied for a quick view of the relative positions of the trend lines of the different nations. The correlation coefficient is sufficient to assess, as a first approximation, the degree and direction of association between the variables, given the numerical data on the governance indicators of the selected countries. The Granger causality test (1969) is used for investigating causal relationships between the variables, cause and effect so to speak. We do not need a stationarity test, as the length of the time series is short. Linear regression is used to quantify the change in the explained variable due to a change in the explanatory variable, in respect of governance vis-à-vis corruption. A bilateral positive causal link between CPI and CC is noticed in the UK: the index value of CC increases by 1.59 units as CPI increases by one unit, and CPI rises by 0.39 units as CC rises by one unit; hence there is a multiplier effect so far as the reduction of corruption is concerned in the UK. GE contributes strongly to the reduction of corruption in the UK. In France, RQ is observed to be the most powerful indicator in reducing corruption, whereas it is the second most powerful indicator after GE in reducing corruption in Japan. The governance indicator GE plays an important role in pushing down corruption in Japan. In China and India, GE is a proactive as well as influential indicator in curbing corruption. The inverse relationship between RL and CPI in Thailand indicates that the ongoing machinery related to RL is not complementary to the reduction of corruption. The state machinery of CC in South Africa is highly relevant to reducing the volume of corruption. In Greece, variations of the CPI positively influence variations of CC, and the indicator GE is effective in controlling corruption as reflected by the CPI. All the governance indicators selected so far have failed to arrest state-level corruption in the USA, Germany and Brazil.
Keywords: corruption perception index, governance indicators, granger causality test, regression
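A minimal sketch of the Granger causality step described above, using statsmodels on two synthetic annual series standing in for the 1996-2012 CPI and CC data (the series values are placeholders, not the study's data). Column order matters: the test asks whether the second column helps predict the first.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.stattools import grangercausalitytests

rng = np.random.default_rng(3)
cc = rng.normal(size=17).cumsum()                     # placeholder governance series
cpi = 0.4 * np.roll(cc, 1) + rng.normal(size=17)      # CPI loosely lagging CC

data = pd.DataFrame({"cpi": cpi, "cc": cc})
grangercausalitytests(data[["cpi", "cc"]], maxlag=2)  # does CC Granger-cause CPI?
```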
Procedia PDF Downloads 306
1355 COVID-19 Teaches Probability Risk Assessment
Authors: Sean Sloan
Abstract:
Probability Risk Assessments (PRA) can be a difficult concept for students to grasp. So, in searching for different ways to describe PRA and relate it to their lives, COVID-19 came up. The parallels are amazing. Soon students began analyzing acceptable risk with the virus. This helped them to quantify just how dangerous is dangerous. The original lesson was dismissed, and for the remainder of the period the probability of risk and the lethality of risk became the topic. Spreading events, such as a COVID carrier on an airliner, became analogous to single-fault casualties such as a tsunami. Odds of spreading became odds of backup diesel generator failure, as with Fukushima Daiichi. Fatalities from the disease became expected fatalities due to radiation spread. Quantification took the discussion from hyperbole and emotion to one in which guidelines could be rationally based. It has been one of the most effective educational devices observed.
Keywords: COVID, education, probability, risk
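Illustrative arithmetic only (the numbers are assumed, not the lesson's data): the same expected-value form underlies both analogies above, the probability of the initiating event times the conditional consequence, whether the event is an infectious passenger or a generator failure.

```python
# Risk = P(initiating event) x conditional consequence, compared to a threshold.
p_event = 0.02              # chance an infectious passenger boards a given flight
p_harm_given_event = 0.15   # chance of onward transmission given that passenger
expected_cases_per_flight = p_event * p_harm_given_event
print(expected_cases_per_flight)   # 0.003; compare against an acceptable-risk threshold
```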
Procedia PDF Downloads 153
1354 A Dihydropyridine Derivative as a Highly Selective Fluorometric Probe for Quantification of Au3+ Residue in Gold Nanoparticle Solution
Authors: Waroton Paisuwan, Mongkol Sukwattanasinitt, Mamoru Tobisu, Anawat Ajavakom
Abstract:
Novel dihydropyridine derivatives (DHP and DHP-OH) were synthesized in one pot via a tandem trimerization-cyclization of methyl propiolate. DHP and DHP-OH possess strong blue fluorescence, with quantum efficiencies over 0.70 in aqueous media. DHP-OH displays remarkable fluorescence quenching selectively in the presence of Au3+, through the oxidation of dihydropyridine to the pyridinium ion, as confirmed by NMR and HRMS. DHP-OH was used to demonstrate the quantitative analysis of Au3+ in water samples, with a limit of detection of 33 ppb and excellent recovery (>95%). This fluorescent probe was also applied to the determination of Au3+ residue in gold nanoparticle solutions and to a paper-based sensing strip for the on-site detection of Au3+.
Keywords: Gold(III) ion detection, fluorescent sensor, fluorescence quenching, dihydropyridine, gold nanoparticles (AuNPs)
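The abstract does not give its calibration model, so this is only a hypothetical Stern-Volmer-style sketch of how quenching ratios of a probe like DHP-OH could be mapped to Au3+ concentration; the standards and ratios below are invented for illustration.

```python
# Hypothetical Stern-Volmer calibration: F0/F - 1 = Ksv * [Au3+].
import numpy as np

conc_ppb = np.array([0., 50., 100., 200., 400.])      # assumed calibration standards
f0_over_f = np.array([1.00, 1.21, 1.43, 1.85, 2.71])  # assumed quenching ratios
ksv = np.polyfit(conc_ppb, f0_over_f - 1.0, 1)[0]     # slope of F0/F - 1 vs [Au3+]

unknown_ratio = 1.60
print((unknown_ratio - 1.0) / ksv)                    # estimated Au3+ in ppb
```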
Procedia PDF Downloads 88
1353 A Literature Review on the Use of Information and Communication Technology within and between Emergency Medical Teams during a Disaster
Authors: Badryah Alshehri, Kevin Gormley, Gillian Prue, Karen McCutcheon
Abstract:
In a disaster event, sharing patient information between the pre-hospital Emergency Medical Services (EMS) and Emergency Department (ED) hospitals is a complex process during which important information may be altered or lost due to poor communication. The aim of this study was to critically discuss the current evidence base on communication between pre-hospital EMS and ED hospital professionals through the use of Information and Communication Technology (ICT). This study followed a systematic approach: six electronic databases (CINAHL, Medline, Embase, PubMed, Web of Science, and IEEE Xplore Digital Library) were comprehensively searched in January 2018, and a second search was completed in April 2020 to capture more recent publications. The study selection process was undertaken independently by the study authors. Both qualitative and quantitative studies were chosen that focused on factors which are positively or negatively associated with coordinated communication between pre-hospital EMS and ED teams in a disaster event. These studies were assessed for quality, and the data were analysed according to the key screening themes which emerged from the literature search. Twenty-two studies were included. Eleven studies employed quantitative methods, seven studies used qualitative methods, and four studies used mixed methods. Four themes emerged on communication between EMTs (pre-hospital EMS and ED staff) in a disaster event using ICT. (1) Disaster preparedness plans and coordination. This theme reported that disaster plans are in place in hospitals, and in some cases, there are interagency agreements with pre-hospital and relevant stakeholders. However, the findings showed that the disaster plans highlighted in these studies lacked information regarding coordinated communications within and between pre-hospital and hospital settings. (2) Communication systems used in the disaster. This theme highlighted that although various communication systems are used between and within hospitals and pre-hospital settings, technical issues have influenced communication between teams during disasters. (3) Integrated information management systems. This theme suggested the need for an integrated health information system which can help pre-hospital and hospital staff to record patient data and ensure the data is shared. (4) Disaster training and drills. While some studies analysed disaster drills and training, the majority of these studies were focused on hospital departments other than EMTs. These studies suggest the need for simulation disaster training and drills, including EMTs. This review demonstrates that considerable gaps remain in the understanding of the communication between EMS and ED hospital staff in relation to disaster response. The review shows that although different types of ICTs are used, various issues remain which affect coordinated communication among the relevant professionals.
Keywords: communication, emergency communication services, emergency medical teams, emergency physicians, emergency nursing, paramedics, information and communication technology, communication systems
Procedia PDF Downloads 86
1352 Quantifying Wave Attenuation over an Eroding Marsh through Numerical Modeling
Authors: Donald G. Danmeier, Gian Marco Pizzo, Matthew Brennan
Abstract:
Although wetlands have been proposed as a green alternative for managing coastal flood hazards because of their capacity to adapt to sea level rise and their provision of multiple ecological and social co-benefits, they are often overlooked due to the challenges of quantifying the uncertainty and natural variability of these systems. The objective of this study was to quantify the wave attenuation provided by a natural marsh surrounding a large oil refinery along the US Gulf Coast, a marsh that has experienced steady erosion along its shoreward edge. The vegetation module of the SWAN wave model was activated and coupled with a hydrodynamic model (DELFT3D) to capture two-way interactions between the changing water level and wave field over the course of a storm event. Since the marsh response to relative sea level rise is difficult to predict, a range of future marsh morphologies was explored. Numerical results were examined to determine the amount of wave attenuation as a function of marsh extent and the relative contributions of white-capping, depth-limited wave breaking, bottom friction, and flexing of vegetation. In addition to the coupled DELFT3D-SWAN modeling of a storm event, an uncoupled SWAN-VEG model was applied to a simplified bathymetry to explore a larger experimental design space. The wave modeling revealed that the rate of wave attenuation decreases at higher surge levels but remains significant over a wide range of water levels and outboard wave heights. The results also provide insight into the minimum marsh extent required to realize the full wave-attenuation potential, so that changing coastal hazards can be managed.Keywords: green infrastructure, wave attenuation, wave modeling, wetland
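To illustrate the attenuation behavior the abstract describes, the sketch below uses the widely cited hyperbolic decay form H(x) = H0 / (1 + beta*x) for wave damping by vegetation, with the damping coefficient beta treated as a tunable input rather than derived from drag, stem geometry, and depth as a full SWAN-VEG run would do. The beta values, and the assumption that beta shrinks at higher surge, are illustrative only.

```python
import numpy as np

def wave_height_decay(h0_m, beta_per_m, x_m):
    """Hyperbolic wave-height decay over vegetation: H(x) = H0 / (1 + beta*x).

    beta is treated here as a calibration input; a full vegetation model
    would compute it from drag coefficient, stem density, and water depth.
    """
    return h0_m / (1.0 + beta_per_m * np.asarray(x_m, dtype=float))

# Hypothetical scenario: incident 1.2 m waves crossing up to 500 m of marsh.
x = np.linspace(0.0, 500.0, 6)

# Illustrative damping coefficients: vegetation is assumed less effective
# in deeper water, so beta is smaller for the higher-surge case.
scenarios = {"moderate surge (beta=0.010 /m)": 0.010,
             "high surge (beta=0.003 /m)": 0.003}

for label, beta in scenarios.items():
    h = wave_height_decay(1.2, beta, x)
    attenuation_pct = 100.0 * (1.0 - h / 1.2)
    print(f"{label} -> attenuation at 500 m: {attenuation_pct[-1]:.0f}%")
```

Even this toy form reproduces the qualitative finding above: the high-surge case attenuates markedly less over the same marsh width, yet the reduction remains substantial.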
Procedia PDF Downloads 132
1351 Modeling of Thermally Induced Acoustic Emission Memory Effects in Heterogeneous Rocks with Consideration for Fracture Development
Authors: Vladimir A. Vinnikov
Abstract:
The paper proposes a model of an inhomogeneous rock mass with an initially random distribution of microcracks on mineral grain boundaries. It describes the behavior of cracks in a medium under the effect of a thermal field when the medium is heated instantaneously to a predetermined temperature. Crack growth occurs according to the concepts of fracture mechanics, provided that the stress intensity factor K exceeds its critical value Kc. The modeling of thermally induced acoustic emission memory effects is based on the assumption that every event of crack nucleation or crack growth caused by heating is accompanied by a single acoustic emission event. Parameters of the thermally induced acoustic emission memory effect produced by cyclic heating and cooling (with the temperature amplitude increasing from cycle to cycle) were calculated for several rock texture types (massive, banded, and disseminated). The study substantiates the adaptation of the proposed model to humidity interference with the thermally induced acoustic emission memory effect, and the influence of humidity on the effect in quasi-homogeneous and banded rocks is estimated. It is shown that such modeling allows the structure and texture of rocks to be taken into account and the influence of interfering factors on the distinctness of the thermally induced acoustic emission memory effect to be estimated. The numerical modeling can be used to obtain information about past thermal impacts on rocks and to determine the degree of rock disturbance by means of non-destructive testing.Keywords: degree of rock disturbance, non-destructive testing, thermally induced acoustic emission memory effects, structure and texture of rocks
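A minimal sketch of the memory-effect mechanism described above is given below: a random population of microcrack strength thresholds, thermal stress taken as simply proportional to temperature, and one AE event per first-time crack failure. The threshold distribution and the stress-temperature proportionality are stand-in assumptions; the actual model evaluates stress intensity factors on grain-boundary cracks.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical microcrack population: each crack fails (emitting one AE
# event) the first time the thermally induced stress exceeds its own
# threshold. The random spread loosely stands in for texture heterogeneity.
n_cracks = 10_000
thresholds = rng.lognormal(mean=4.0, sigma=0.5, size=n_cracks)  # arbitrary units

# Thermal stress taken as proportional to the temperature rise above
# ambient (a crude stand-in for the full thermoelastic problem).
stress_per_degC = 1.0

peak_temps = [40.0, 60.0, 80.0, 100.0]  # amplitudes grow from cycle to cycle
failed = np.zeros(n_cracks, dtype=bool)

for cycle, peak in enumerate(peak_temps, start=1):
    stress = stress_per_degC * peak
    # Only not-yet-failed cracks below the current stress emit events.
    new_events = (~failed) & (thresholds <= stress)
    failed |= new_events
    print(f"cycle {cycle}: peak {peak:5.1f} degC -> {new_events.sum():5d} AE events")
```

Because already-failed cracks stay silent, reheating below the previous maximum temperature produces no events, which is precisely the memory effect; each larger cycle adds only a fresh burst from the previously unbroken population.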
Procedia PDF Downloads 264
1350 1-Butyl-2,3-Dimethylimidazolium Bis (Trifluoromethanesulfonyl) Imide and Titanium Oxide Based Voltammetric Sensor for the Quantification of Flunarizine Dihydrochloride in Solubilized Media
Authors: Rajeev Jain, Nimisha Jadon, Kshiti Singh
Abstract:
A glassy carbon electrode modified with titanium oxide nanoparticles and 1-butyl-2,3-dimethylimidazolium bis(trifluoromethanesulfonyl) imide (TiO2/IL/GCE) has been fabricated for the electrochemical sensing of flunarizine dihydrochloride (FRH). The electrochemical properties and morphology of the prepared nanocomposite were studied by electrochemical impedance spectroscopy (EIS) and transmission electron microscopy (TEM). The response of the electrochemical sensor was proportional to the concentration of FRH in the range from 0.5 µg mL-1 to 16 µg mL-1, with a detection limit of 0.03 µg mL-1. The proposed method was also applied to the determination of FRH in a pharmaceutical formulation and in human serum with good recoveries.Keywords: flunarizine dihydrochloride, ionic liquid, nanoparticles, voltammetry, human serum
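The reported linear range and recoveries can be illustrated with a simple calibration-and-spike-recovery computation. In the sketch below, the peak-current values and the serum spike are hypothetical, chosen only to fall inside the reported 0.5-16 µg/mL range.

```python
import numpy as np

# Hypothetical calibration: voltammetric peak current (uA) versus FRH
# concentration (ug/mL) within the reported linear range.
conc = np.array([0.5, 2.0, 4.0, 8.0, 12.0, 16.0])
peak_current = np.array([0.42, 1.61, 3.18, 6.45, 9.58, 12.81])

slope, intercept = np.polyfit(conc, peak_current, 1)

def quantify(i_peak_uA):
    """Invert the calibration line to estimate concentration (ug/mL)."""
    return (i_peak_uA - intercept) / slope

# Spike-recovery check on a serum sample (values illustrative): a known
# amount of FRH is added and the percentage recovered is reported.
spiked_ug_mL = 8.0
measured_current = 6.39
found = quantify(measured_current)
print(f"found: {found:.2f} ug/mL, recovery: {100.0 * found / spiked_ug_mL:.1f}%")
```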
Procedia PDF Downloads 332
1349 Bioanalytical Method Development and Validation of Aminophylline in Rat Plasma Using Reverse Phase High Performance Liquid Chromatography: An Application to Preclinical Pharmacokinetics
Authors: S. G. Vasantharaju, Viswanath Guptha, Raghavendra Shetty
Abstract:
Introduction: Aminophylline is a methylxanthine derivative belonging to the bronchodilator class. A literature survey reveals that reported methods rely on solid-phase extraction and liquid-liquid extraction, which are highly variable, time consuming, costly, and laborious. The present work aims to develop a simple, highly sensitive, precise, and accurate high-performance liquid chromatography method for the quantification of aminophylline in rat plasma samples that can be utilized for preclinical studies. Method: Reverse-phase high-performance liquid chromatography. Results: Selectivity: Aminophylline and the internal standard were well separated from the co-eluted components, and there was no interference from endogenous material at the retention times of the analyte and the internal standard. The LLOQ measurable with acceptable accuracy and precision was 0.5 µg/mL. Linearity: The developed and validated method is linear over the range of 0.5-40.0 µg/mL, with a coefficient of determination greater than 0.9967. Accuracy and precision: The intra- and inter-day accuracy and precision values at low, medium, and high quality control concentrations of aminophylline in plasma were within acceptable limits. Extraction recovery: The method produced consistent extraction recovery at all three QC levels; the mean extraction recovery of aminophylline was 93.57 ± 1.28%, while that of the internal standard was 90.70 ± 1.30%. Stability: The results show that aminophylline is stable in rat plasma under the studied stability conditions and for about 30 days when stored at -80 °C. Pharmacokinetic studies: The method was successfully applied to the quantitative estimation of aminophylline in rat plasma following oral administration to rats. Discussion: Preclinical studies require a rapid and sensitive method for estimating drug concentration in rat plasma. The method described in this article uses a simple protein-precipitation extraction technique with ultraviolet detection for quantification, and it is simple and robust for fast, high-throughput sample analysis at low cost. No interfering peaks were observed at the elution times of aminophylline and the internal standard, and the method had sufficient selectivity, specificity, precision, and accuracy over the concentration range of 0.5-40.0 µg/mL. An isocratic separation technique was used, underlining the simplicity of the presented method.Keywords: aminophylline, preclinical pharmacokinetics, rat plasma, RP-HPLC
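As a sketch of the noncompartmental analysis such a validated method enables, the code below computes Cmax, Tmax, and AUC by the linear trapezoidal rule from a hypothetical rat concentration-time profile; none of the values are the study's data.

```python
import numpy as np

# Hypothetical concentration-time profile after oral dosing in rats
# (illustrative values, not the study's measurements).
t_h = np.array([0.0, 0.25, 0.5, 1.0, 2.0, 4.0, 8.0, 12.0, 24.0])
c_ug_mL = np.array([0.0, 3.1, 6.8, 9.4, 8.2, 5.6, 2.4, 1.1, 0.2])

# Basic noncompartmental metrics: Cmax and Tmax read directly off the
# profile, AUC by the linear trapezoidal rule over the sampling interval.
cmax = c_ug_mL.max()
tmax = t_h[c_ug_mL.argmax()]
auc = np.trapz(c_ug_mL, t_h)

print(f"Cmax = {cmax:.1f} ug/mL at Tmax = {tmax:.1f} h")
print(f"AUC(0-24 h) = {auc:.1f} ug*h/mL")
```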
Procedia PDF Downloads 223