Search results for: reduction in potential medical errors due to elimination of transcription errors
19218 An Approach to Solving Some Inverse Problems for Parabolic Equations
Authors: Bolatbek Rysbaiuly, Aliya S. Azhibekova
Abstract:
Problems concerning the interpretation of well testing results belong to the class of inverse problems of subsurface hydromechanics. The distinctive feature of such problems is that the available additional information depends on the capabilities of oilfield experiments. Another factor that should not be overlooked is the presence of errors in the test data. To determine reservoir properties, several inverse problems for parabolic equations were investigated, and an approach to solving them based on the method of regularization is proposed.
Keywords: iterative approach, inverse problem, parabolic equation, reservoir properties
Procedia PDF Downloads 429
19217 Carbon Supported Silver Nanostructures for Electrochemical Carbon Dioxide Reduction
Authors: Sonali Panigrahy, Manjunatha K., Sudip Barman
Abstract:
Electrocatalytic reduction methods hold significant promise for mitigating excessive greenhouse gas emissions, particularly of carbon dioxide (CO₂). Because CO₂ reduction is a complex, multi-electron, multi-product process, a highly effective catalyst is essential for converting CO₂ into valuable products. The electrochemical reduction of CO₂, driven by renewable energy sources, offers an opportunity to reduce CO₂ emissions while generating valuable chemicals and fuels, with syngas being a noteworthy product. Silver-based electrodes have been the focus of extensive research due to their low overpotential and remarkable selectivity toward carbon monoxide (CO) in the electrocatalytic CO₂ reduction reaction (CO₂RR). In this study, we describe the synthesis of carbon-supported silver nanoparticles (Ag/C), which serve as efficient electrocatalysts for CO₂ reduction. The as-prepared Ag/C catalyst is not only cost-effective but also highly proficient at converting CO₂ and H₂O into syngas, a tunable mixture of hydrogen (H₂) and carbon monoxide (CO). The highest faradaic efficiency for CO production on Ag/C was 56.4% at -1.4 V vs. Ag/AgCl, and the maximum partial current density for CO generation was -9.4 mA cm⁻² at -1.6 V vs. Ag/AgCl. This research demonstrates the potential of Ag/C electrocatalysts to enable sustainable syngas production, contributing to the reduction of CO₂ emissions and the synthesis of valuable chemical precursors and fuels.
Keywords: CO₂, carbon monoxide, electrochemical, silver
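The faradaic efficiency figure quoted above (56.4% for CO) follows from simple charge bookkeeping; a minimal sketch with illustrative numbers, not the authors' raw data:

```python
F = 96485.0  # Faraday constant, C/mol

def faradaic_efficiency(n_electrons, moles_product, charge_coulombs):
    """Percentage of the passed charge that went into the target product."""
    return 100.0 * n_electrons * F * moles_product / charge_coulombs

# CO2 -> CO is a 2-electron reduction; suppose 2.923e-6 mol CO formed
# while 1.0 C of charge passed (hypothetical numbers):
print(round(faradaic_efficiency(2, 2.923e-6, 1.0), 1))  # 56.4
```

The same bookkeeping, applied per product, is how a syngas H₂/CO ratio is typically reported.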
Procedia PDF Downloads 70
19216 Correction Factors for Soil-Structure Interaction Predicted by Simplified Models: Axisymmetric 3D Model versus Fully 3D Model
Authors: Fu Jia
Abstract:
The effects of soil-structure interaction (SSI) are often studied using axisymmetric three-dimensional (3D) models to avoid the high computational cost of the more realistic, fully 3D models, which require 2-3 orders of magnitude more computer time and storage. This paper analyzes the error and presents correction factors for the system frequency, system damping, and peak amplitude of structural response computed by axisymmetric models embedded in a uniform or layered half-space. The results are compared with those for fully 3D rectangular foundations of different aspect ratios. Correction factors are presented for a range of model parameters, such as fixed-base frequency, structure mass, height and length-to-width ratio, foundation embedment, and soil-layer stiffness and thickness. It is shown that the errors are larger for stiffer, taller, and heavier structures, deeper foundations, and deeper soil layers. For example, for a stiff structure like the Millikan Library (NS response; length-to-width ratio 1), the error is 6.5% in system frequency, 49% in system damping, and 180% in peak amplitude. Analysis of a case study shows that the NEHRP-2015 provisions for reduction of base shear force due to SSI effects may be unsafe for some structures and need revision. The presented correction-factor diagrams can be used in practical design and other applications.
Keywords: 3D soil-structure interaction, correction factors for axisymmetric models, length-to-width ratio, NEHRP-2015 provisions for reduction of base shear force, rectangular embedded foundations, SSI system frequency, SSI system damping
Procedia PDF Downloads 267
19215 A Comparative Study of Additive and Nonparametric Regression Estimators and Variable Selection Procedures
Authors: Adriano Z. Zambom, Preethi Ravikumar
Abstract:
One of the biggest challenges in nonparametric regression is the curse of dimensionality. Additive models are known to overcome this problem by estimating only the individual additive effect of each covariate. However, if the model is misspecified, the accuracy of the estimator compared to the fully nonparametric one is unknown. In this work, the efficiency of completely nonparametric regression estimators such as Loess is compared to that of estimators that assume additivity, in several situations including additive and non-additive regression scenarios. The comparison is done by computing the oracle mean square error of the estimators with regard to the true nonparametric regression function. Then, a backward elimination selection procedure based on the Akaike Information Criterion is proposed, computed from either the additive or the nonparametric model. Simulations show that if the additive model is misspecified, the percentage of time it fails to select important variables can be higher than that of the fully nonparametric approach. A dimension reduction step is included when the nonparametric estimator cannot be computed due to the curse of dimensionality. Finally, the Boston housing dataset is analyzed using the proposed backward elimination procedure, and the selected variables are identified.
Keywords: additive model, nonparametric regression, variable selection, Akaike Information Criterion
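The backward elimination idea described above (drop a covariate whenever doing so lowers the AIC of the refit model) can be sketched from scratch; this is an illustration on simulated linear data, not the authors' additive/nonparametric implementation:

```python
import math
import random

def ols_rss(X, y):
    """Residual sum of squares from ordinary least squares, solving the
    normal equations (X'X)b = X'y by Gaussian elimination with pivoting."""
    n, p = len(X), len(X[0])
    A = [[sum(X[i][j] * X[i][k] for i in range(n)) for k in range(p)]
         + [sum(X[i][j] * y[i] for i in range(n))] for j in range(p)]
    for c in range(p):
        piv = max(range(c, p), key=lambda r: abs(A[r][c]))
        A[c], A[piv] = A[piv], A[c]
        for r in range(c + 1, p):
            f = A[r][c] / A[c][c]
            A[r] = [a - f * b for a, b in zip(A[r], A[c])]
    b = [0.0] * p
    for c in reversed(range(p)):
        b[c] = (A[c][p] - sum(A[c][k] * b[k] for k in range(c + 1, p))) / A[c][c]
    return sum((y[i] - sum(X[i][j] * b[j] for j in range(p))) ** 2
               for i in range(n))

def aic(rss, n, k):
    # Gaussian log-likelihood form of the Akaike Information Criterion
    return n * math.log(rss / n) + 2 * k

def backward_eliminate(X, y, names):
    """Greedy backward elimination: drop a covariate whenever the AIC of
    the refit (linear, intercept-included) model decreases."""
    keep, n = list(range(len(names))), len(y)
    def score(cols):
        Xd = [[1.0] + [row[j] for j in cols] for row in X]
        return aic(ols_rss(Xd, y), n, len(cols) + 1)
    best, improved = score(keep), True
    while improved and len(keep) > 1:
        improved = False
        for j in list(keep):
            s = score([c for c in keep if c != j])
            if s < best:
                best, keep, improved = s, [c for c in keep if c != j], True
    return [names[j] for j in keep]

random.seed(0)
X = [[random.gauss(0, 1) for _ in range(3)] for _ in range(200)]
y = [2.0 * r[0] - 1.5 * r[1] + random.gauss(0, 0.5) for r in X]  # x3 is pure noise
selected = backward_eliminate(X, y, ["x1", "x2", "x3"])
print(selected)  # x1 and x2 should survive elimination
```

In the paper's setting the inner fit would be an additive or fully nonparametric estimator rather than OLS; only the AIC-driven elimination loop is the same.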
Procedia PDF Downloads 266
19214 Elimination of Mixed-Culture Biofilms Using Biological Agents
Authors: Anita Vidacs, Csaba Vagvolgyi, Judit Krisch
Abstract:
The attachment of microorganisms to different surfaces and the development of biofilms can lead to outbreaks of food-borne diseases and economic losses due to perished food. In food processing environments, bacterial communities are generally formed by mixed cultures of different species. Plants are sources of several antimicrobial substances that may be potential candidates for the development of new disinfectants. We aimed to investigate the essential oils of cinnamon (Cinnamomum zeylanicum), marjoram (Origanum majorana), and thyme (Thymus vulgaris) and their major components (cinnamaldehyde, terpinene-4-ol, and thymol) against four-species biofilms of E. coli, L. monocytogenes, P. putida, and S. aureus. The experiments had three parts: (i) determination of the minimum bactericidal concentration and the killing time with microdilution methods; (ii) elimination of the four-species 24- and 168-hour-old biofilms from stainless steel, polypropylene, tile, and wood surfaces; and (iii) comparison of the disinfectant effect with an industrially used peracetic acid-based sanitizer (HC-DPE). In biofilm, E. coli and P. putida were more resistant to the investigated essential oils and their main components than L. monocytogenes and S. aureus; these Gram-negative bacteria were detected on surfaces where the plant-based disinfectant did not completely eliminate the biofilm. The most promising agents were cinnamon essential oil and terpinene-4-ol, which eradicated the biofilm from stainless steel, polypropylene, and even tile, with a better disinfectant effect than HC-DPE. These natural agents can be used as alternative solutions in the battle against bacterial biofilms.
Keywords: biofilm, essential oils, surfaces, terpinene-4-ol
Procedia PDF Downloads 112
19213 Survey of Neonatologists’ Burnout on a Neonatal Surgical Unit: Audit Study from Cairo University Specialized Pediatric Hospital
Authors: Mahmoud Tarek, Alaa Obeida, Mai Magdy, Khalid Hussein, Aly Shalaby
Abstract:
Background: More doctors are complaining of burnout than before. Burnout is a state of physical and mental exhaustion caused by the doctor’s lifestyle; unfortunately, medical errors are also more likely among those suffering from burnout, and these may result in malpractice suits. Methodology: This is a retrospective audit of burnout responses from all neonatologists over a 9-month period. Data were gathered with a burnout questionnaire obtained from 23 physicians, who were divided into 5 categories according to the final score over its 28 questions: category 1 (score 28-38), almost no work stress; category 2 (score 38-50), a low amount of job-related stress; category 3 (score 51-70), a moderate amount of stress; category 4 (score 71-90), a high amount of job stress and beginning burnout; and category 5 (score 91 and above), a dangerous amount of stress and an advanced stage of burnout. Results: 33 neonatologists received the questionnaire and 23 responses were returned, a response rate of 69.6%. The results showed that 61% of physicians fell into category 4 and 31% into category 5, while the remaining 8% were equally distributed between categories 2 and 3 (4% each); no physician fell into category 1. Conclusion: Burnout is prevalent in SNICUs, so interventions to minimize its prevalence may be of great importance, as burnout may be reflected indirectly in the medical condition of patients and physicians; efforts should be made to decrease this high rate of burnout.
Keywords: Cairo, work overload, exhaustion, surgery, neonatal ICU
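The five scoring bands above can be expressed as a small lookup. Note that the published bands overlap at their edges (28-38 vs. 38-50, 70 vs. 71, etc.); as an assumption, this sketch resolves each boundary downward:

```python
def burnout_category(total_score):
    """Map a 28-item questionnaire total (minimum 28) to the study's bands.
    Boundary scores are resolved into the lower band (an assumption, since
    the published bands overlap at their edges)."""
    if total_score < 28:
        raise ValueError("minimum possible total is 28")
    if total_score <= 38:
        return 1  # almost no work stress
    if total_score <= 50:
        return 2  # low job-related stress
    if total_score <= 70:
        return 3  # moderate stress
    if total_score <= 90:
        return 4  # high job stress, beginning to burn out
    return 5      # dangerous stress, advanced burnout

print([burnout_category(s) for s in (30, 45, 60, 85, 95)])  # [1, 2, 3, 4, 5]
```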
Procedia PDF Downloads 213
19212 Cytotoxicity and Androgenic Potential of Antifungal Drug Substances on MDA-KB2 Cells
Authors: Benchouala Amira, Bojic Clement, Poupin Pascal, Cossu Leguille-carole
Abstract:
The objective of this study is to evaluate in vitro the cytotoxic and androgenic potential of several antifungal molecules (amphotericin B, econazole, ketoconazole, and miconazole) on MDA-Kb2 cell lines. This biological model is an effective tool for the detection of endocrine disruptors because it responds well to the main agonist of the androgen receptor (testosterone) and also to an antagonist, flutamide. The cytotoxicity of each compound tested was measured using an MTT assay (tetrazolium salt, 3-(4,5-dimethylthiazol-2-yl)-2,5-diphenyltetrazolium bromide), which measures the reductase activity of the mitochondrial succinate dehydrogenase enzymes of cultured cells. This complementary cytotoxicity test is essential to ensure that the reduction in luminescence intensity observed during the androgenic tests is attributable only to the anti-androgenic action of the compounds tested and not to possible cytotoxic properties. Tests of the androgenic activity of the antifungals show that these compounds do not have the capacity to induce transcription of the luciferase gene; they do not exert an androgenic effect on cultured MDA-Kb2 cells at the environmental concentrations tested. The addition of flutamide at the same tested concentrations of antifungal molecules reduces the luminescence induced by amphotericin B, econazole, and miconazole, which is explained by a strong interaction of these molecules with flutamide that may have a greater toxic effect than when each is tested alone. The cytotoxicity test shows that econazole and ketoconazole can cause cell death at certain tested concentrations. This cell mortality may be induced by a direct or indirect action on deoxyribonucleic acid (DNA), ribonucleic acid (RNA), or proteins necessary for cell division.
Keywords: cytotoxicity, androgenic potential, antifungals, MDA-Kb2
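The MTT readout described above reduces to a blank-corrected absorbance ratio against the untreated control; a minimal sketch with hypothetical absorbance values:

```python
def percent_viability(a_treated, a_control, a_blank):
    """Blank-corrected MTT absorbance, expressed as a percentage of control."""
    return 100.0 * (a_treated - a_blank) / (a_control - a_blank)

# hypothetical 570 nm absorbances: treated well, untreated control, blank
print(round(percent_viability(0.62, 1.02, 0.12), 1))  # 55.6
```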
Procedia PDF Downloads 51
19211 The Different Effects of Mindfulness-Based Relapse Prevention Group Therapy on QEEG Measures in Various Severity Substance Use Disorder Involuntary Clients
Authors: Yu-Chi Liao, Nai-Wen Guo, Chun‑Hung Lee, Yung-Chin Lu, Cheng-Hung Ko
Abstract:
Objective: The incidence of behavioral addictions, especially substance use disorders (SUDs), is increasingly taken seriously, along with various physical health problems. Mindfulness-based relapse prevention (MBRP) has become a treatment option for promoting long-term health behavior change in recent years. MBRP is a structured protocol that integrates formal meditation practices with the cognitive-behavioral approach of relapse prevention treatment by teaching participants not to engage in reappraisal or savoring techniques. However, considering SUDs as a complex brain disease, questionnaires and symptom evaluation are not sufficient to evaluate the effect of MBRP; neurophysiological biomarkers such as the quantitative electroencephalogram (QEEG) may represent curative effects more accurately. This study attempted to find neurophysiological indicators of MBRP in involuntary clients with SUDs of varying severity. Participants and Methods: Thirteen participants (all male) completed 8-week mindfulness-based treatment provided by trained, licensed clinical psychologists. Behavioral data came from the Severity of Dependence Scale (SDS) and the Negative Mood Regulation Scale (NMR), administered before and after MBRP treatment. QEEG data were recorded simultaneously with an executive attention task, the Comprehensive Nonverbal Attention Test (CNAT). Two-way repeated-measures (treatment × severity) ANOVA and independent t-tests were used for statistical analysis. Results: The thirteen participants were regrouped into high substance dependence (HS) and low substance dependence (LS) groups by the SDS cut-off. At pretest, the HS group showed a higher SDS total score and a lower gamma wave in the Go/No-Go task of the CNAT. Both groups showed a main effect of a lower frontal theta/beta ratio (TBR) during the simple reaction time task of the CNAT, and a main effect of fewer delay errors on the CNAT after MBRP; there was no other between-group difference on the CNAT.
However, after MBRP, the HS group made markedly greater progress than the LS group in improving SDS and NMR scores. On the neurophysiological index, the frontal TBR of the HS group during the Go/No-Go task of the CNAT decreased relative to that of the LS group, while the LS group showed a significant reduction in the gamma wave on the Go/No-Go task of the CNAT. Conclusion: The QEEG data support that MBRP can restore the prefrontal function of involuntary addicts and lower their errors in executive attention tasks. However, the improvement from MBRP for addicts with high addiction severity is significantly greater than for those with low severity, on both QEEG indicators and negative emotion regulation. Future directions include investigating the reasons for the differences in efficacy across addiction severities.
Keywords: mindfulness, involuntary clients, QEEG, emotion regulation
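The frontal theta/beta ratio (TBR) used above as the neurophysiological index is a ratio of EEG band powers. A self-contained sketch on a synthetic two-tone signal; the band edges (theta 4-8 Hz, beta 13-30 Hz) are conventional assumptions, not taken from the paper:

```python
import math

def band_power(signal, fs, f_lo, f_hi):
    """Power in [f_lo, f_hi) Hz via a naive DFT; fine for short demo signals."""
    n, power = len(signal), 0.0
    for k in range(1, n // 2):
        if f_lo <= k * fs / n < f_hi:
            re = sum(signal[t] * math.cos(2 * math.pi * k * t / n) for t in range(n))
            im = sum(signal[t] * math.sin(2 * math.pi * k * t / n) for t in range(n))
            power += (re * re + im * im) / n ** 2
    return power

fs, n = 128, 256
# synthetic "frontal EEG": a 6 Hz theta tone plus a weaker 20 Hz beta tone
sig = [1.0 * math.sin(2 * math.pi * 6 * t / fs) +
       0.5 * math.sin(2 * math.pi * 20 * t / fs) for t in range(n)]
theta = band_power(sig, fs, 4, 8)    # theta band, 4-8 Hz
beta = band_power(sig, fs, 13, 30)   # beta band, 13-30 Hz
print(round(theta / beta, 1))  # amplitude ratio 2 -> power ratio 4.0
```

Real QEEG pipelines use windowed spectral estimates (e.g. Welch averaging) rather than a single raw DFT, but the ratio itself is computed the same way.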
Procedia PDF Downloads 147
19210 The Profit Trend of Cosmetics Products Using Bootstrap Edgeworth Approximation
Authors: Edlira Donefski, Lorenc Ekonomi, Tina Donefski
Abstract:
Edgeworth approximation is one of the most important statistical methods and has made a considerable contribution to reducing the sum of the standard deviations of the independent variables’ coefficients in a quantile regression model, which estimates the conditional median or other quantiles. In this paper, we have applied approximating statistical methods to an economic problem. We created and generated a quantile regression model to see how the profit gained is connected with the realized sales of cosmetic products, using real data taken from a local business. The linear regression of the generated profit on realized sales was not free of autocorrelation and heteroscedasticity, which is why we used this model instead of linear regression. Our aim is to analyze in more detail the relation between the variables under study, profit and finalized sales, and how to minimize the standard errors of the independent variable involved, the level of realized sales. The statistical methods applied in our work are the Edgeworth approximation for independent and identically distributed (IID) cases, a bootstrap version of the model, and the Edgeworth approximation for the bootstrap quantile regression model. The graphics and results presented here identify the best approximating model of our study.
Keywords: bootstrap, Edgeworth approximation, IID, quantile
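The bootstrap component above resamples the data to estimate the variability of a quantile-based estimator. A minimal sketch for the simplest such case, the standard error of the median (the 0.5 quantile), on simulated profit-like data; the full study pairs this resampling with quantile regression and Edgeworth corrections:

```python
import random
import statistics

def bootstrap_se(data, stat, n_boot=1000, seed=1):
    """Bootstrap standard error: resample with replacement, recompute the
    statistic on each resample, take the standard deviation of replicates."""
    rng = random.Random(seed)
    reps = [stat([rng.choice(data) for _ in data]) for _ in range(n_boot)]
    return statistics.stdev(reps)

random.seed(1)
profit = [random.gauss(100, 15) for _ in range(200)]  # stand-in for profit data
se_median = bootstrap_se(profit, statistics.median)
print(round(se_median, 2))
```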
Procedia PDF Downloads 159
19209 Stress-Controlled Senescence and Development in Arabidopsis thaliana by Root Associated Factor (RAF), a NAC Transcription Regulator
Authors: Iman Kamranfar, Gang-Ping Xue, Salma Balazadeh, Bernd Mueller-Roeber
Abstract:
Adverse environmental conditions such as salinity stress, high temperature, and drought limit plant growth and typically lead to precocious tissue degeneration and leaf senescence, a process by which nutrients from photosynthetic organs are recycled for the formation of flowers and seeds to ensure that the next generation is reached under such harmful conditions. In addition, abiotic stress affects developmental patterns that help the plant withstand unfavourable environmental conditions. We discovered a NAC (for NAM, ATAF1/2, and CUC2) transcription factor (TF), called RAF in the following, which plays a central role in abiotic drought-stress-triggered senescence and in the control of developmental adaptations to stressful environments. RAF is an ABA-responsive TF; RAF overexpressors are hypersensitive to abscisic acid (ABA) and exhibit precocious senescence, while knock-out mutants show delayed senescence. To explore the RAF gene regulatory network (GRN), we determined its preferred DNA binding sites by a binding site selection assay (BSSA) and performed microarray-based expression profiling using inducible RAF overexpression lines and chromatin immunoprecipitation (ChIP)-PCR. Our studies identified several direct target genes, including those encoding catabolic enzymes acting during stress-induced senescence, as well as various genes controlling drought stress-related developmental changes. Based on these results, we conclude that RAF functions as a central transcriptional regulator that coordinates developmental programs with stress-related inputs from the environment. To explore the potential agricultural applications of our findings, we are currently extending our studies to crop species.
Keywords: abiotic stress, Arabidopsis, development, transcription factor
Procedia PDF Downloads 195
19208 Impact of Fin Cross Section Shape on Potential Distribution of Nanoscale Trapezoidal FinFETs
Authors: Ahmed Nassim Moulai Khatir
Abstract:
Fin field effect transistors (FinFETs) deliver superior scalability compared with the classical MOSFET structure by eliminating short-channel effects. Modern FinFETs are 3D structures that rise above the planar substrate, but some have inclined sidewall surfaces, which results in trapezoidal cross sections instead of the rectangular sections usually assumed. The fin cross-section shape gives rise to device issues such as degraded potential distribution. This work analyzes that impact through three-dimensional numerical simulation of several triple-gate FinFETs with various top and bottom fin widths. The simulation results show that the potential distribution and the electric field in the fin depend on the sidewall inclination angle.
Keywords: FinFET, cross section shape, SILVACO, trapezoidal FinFETs
Procedia PDF Downloads 49
19207 The Transcriptional Regulation of Human LRWD1 through DNA Methylation
Authors: Yen-Ni Teng, Hsing-Yi Chen, Hsien-An Pan, Yung-Ming Lin, Hany A. Omar, Jui-Hsiang Hung
Abstract:
Leucine-rich repeats and WD repeat domain containing 1 (LRWD1) is highly expressed in the testes of healthy males; on the other hand, it is significantly down-regulated in the testicular tissues of patients with severe spermatogenic defects. In our study, downregulation of LRWD1 expression by shRNA caused a significant reduction in cell growth and mitosis and a noteworthy increase in the microtubule atrophy rate. Here, we used EMBOSS CpG plot analysis to explore the promoter region of the LRWD1 gene and found CpG islands located between positions -253 and +5 relative to the LRWD1 transcription start site. Luciferase reporter assays revealed that hypermethylation of the LRWD1 promoter reduced transcriptional activity in cells. In addition, quantitative methylation-specific PCR and immunostaining showed that the methylation inhibitor 5-aza-2'-deoxycytidine increased LRWD1 promoter activity, LRWD1 mRNA and protein expression, and cell viability, whereas the methylation activator S-adenosylmethionine had the opposite effects. Overexpression of p53 and Nrf2 in NT2/D1 cells increased LRWD1 promoter activity, while 5-fluorodeoxyuridine decreased it. In conclusion, this study provides evidence that the methylation status of the LRWD1 promoter is associated with LRWD1 expression. Since the expression level of LRWD1 plays an important role in spermatogenesis, LRWD1 methylation status may serve as a novel molecular diagnostic or therapeutic target in male infertility.
Keywords: LRWD1, DNA methylation, p53, Nrf2
Procedia PDF Downloads 148
19206 Getting It Right Before Implementation: Using Simulation to Optimize Recommendations and Interventions After Adverse Event Review
Authors: Melissa Langevin, Natalie Ward, Colleen Fitzgibbons, Christa Ramsey, Melanie Hogue, Anna Theresa Lobos
Abstract:
Description: Root Cause Analysis (RCA) is used by health care teams to examine adverse events (AEs) and identify causes, which then leads to recommendations for prevention. Despite widespread use, RCA has limitations: best practices have not been established for implementing recommendations or tracking the impact of interventions after AEs. During phase 1 of this study, we used simulation to analyze two fictionalized AEs that occurred in hospitalized paediatric patients, to understand how the errors occurred and to generate recommendations to mitigate and prevent recurrences. Scenario A involved an error of commission (an inpatient drug error), and Scenario B involved detecting an error that had already occurred (a critical care drug infusion error). The recommendations generated were improved drug labeling, specialized drug kits, alert signs, and clinical checklists. Aim: To use simulation to optimize interventions recommended after critical event analysis prior to implementation in the clinical environment. Methods: The interventions suggested in phase 1 were designed and tested through scenario simulation in the clinical environment (medicine ward or pediatric intensive care unit). Each scenario was simulated 8 times. Recommendations were tested using different volunteer teams, and each scenario was debriefed to understand why the error was repeated despite interventions and how the interventions could be improved. Interventions were modified over subsequent simulations until they were felt to have an optimal effect and data saturation was achieved. Along with concrete suggestions for design and process change, qualitative data on employee communication and hospital standard work were collected and analyzed. Results: Each scenario had a total of three interventions to test. In scenario 1, the error was reproduced in the first two iterations and mitigated following key intervention changes.
In scenario 2, the error was identified immediately in all cases where the intervention checklist was utilized properly. Independently of intervention changes and improvements, the simulation was useful for identifying which interventions should be prioritized for implementation, and it highlighted that even the potential solutions most frequently suggested by participants did not always translate into error prevention in the clinical environment. Conclusion: We conclude that interventions that change process (an epinephrine kit or a mandatory checklist) were more successful at preventing errors than passive interventions (signage, changes in memory aids). Given that even the most successful interventions needed modification and subsequent re-testing, simulation is key to optimizing suggested changes. Simulation is a safe, practice-changing modality for institutions to use prior to implementing recommendations from RCA following AE reviews.
Keywords: adverse events, patient safety, pediatrics, root cause analysis, simulation
Procedia PDF Downloads 153
19205 Effects of Machining Parameters on the Surface Roughness and Vibration of the Milling Tool
Authors: Yung C. Lin, Kung D. Wu, Wei C. Shih, Jui P. Hung
Abstract:
High-speed, high-precision machining has become one of the most important technologies in the manufacturing industry, and the surface roughness of high-precision components is regarded as a key characteristic of product quality. However, machining chatter can damage the machined surface and restrict process efficiency; selecting appropriate cutting conditions is therefore important to prevent the occurrence of chatter. In addition, vibration of the spindle tool also affects surface quality, which implies that surface precision can be controlled by monitoring spindle tool vibration. Based on this concept, this study investigated the influence of machining conditions on surface roughness and spindle tool vibration. To this end, a series of machining tests was conducted on aluminum alloy. In the tests, spindle tool vibration was measured using acceleration sensors, and the surface roughness of the machined parts was examined using a white light interferometer. Response surface methodology (RSM) was employed to establish mathematical models for predicting surface finish and tool vibration, respectively, and the correlation between surface roughness and spindle tool vibration was analyzed by ANOVA. According to the machining tests, machined surfaces with and without chatter were marked on the lobe diagram to verify the machining conditions. Using multivariable regression analysis, the mathematical models for predicting surface roughness and tool vibration were developed from the machining parameters: cutting depth (a), feed rate (f), and spindle speed (s). The predicted roughness agrees well with the measured roughness, with an average error of 10%; the average error of the tool vibration between the measurements and the model predictions is about 7.39%.
In addition, tool vibration under the various machining conditions was found to correlate positively with surface roughness (r = 0.78). In conclusion, mathematical models were successfully developed for predicting the surface roughness and vibration level of the spindle tool under different cutting conditions, which can help in selecting appropriate cutting parameters and in monitoring machining conditions to achieve high surface quality in milling operations.
Keywords: machining parameters, machining stability, regression analysis, surface roughness
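The "average percentage of errors" used above to compare measured and model-predicted values is a mean absolute percentage error; a minimal sketch with hypothetical roughness values, not the study's measurements:

```python
def mape(measured, predicted):
    """Mean absolute percentage error between measurements and model output."""
    return 100.0 * sum(abs(m - p) / m
                       for m, p in zip(measured, predicted)) / len(measured)

ra_measured = [0.42, 0.55, 0.61, 0.48]   # hypothetical Ra values, micrometres
ra_predicted = [0.45, 0.50, 0.66, 0.46]  # hypothetical RSM-model predictions
print(round(mape(ra_measured, ra_predicted), 1))  # 7.1
```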
Procedia PDF Downloads 232
19204 Enhancing Code Security with AI-Powered Vulnerability Detection
Authors: Zzibu Mark Brian
Abstract:
As software systems become increasingly complex, ensuring code security is a growing concern. Traditional vulnerability detection methods often rely on manual code reviews or static analysis tools, which can be time-consuming and prone to errors. This paper presents a distinct approach to enhancing code security by leveraging artificial intelligence (AI) and machine learning (ML) techniques. Our proposed system utilizes a combination of natural language processing (NLP) and deep learning algorithms to identify and classify vulnerabilities in real-world codebases. By analyzing vast amounts of open-source code data, our AI-powered tool learns to recognize patterns and anomalies indicative of security weaknesses. We evaluated our system on a dataset of over 10,000 open-source projects, achieving an accuracy rate of 92% in detecting known vulnerabilities. Furthermore, our tool identified previously unknown vulnerabilities in popular libraries and frameworks, demonstrating its potential for improving software security.
Keywords: AI, machine learning, code security
Procedia PDF Downloads 40
19203 Decision Making in Medicine and Treatment Strategies
Authors: Kamran Yazdanbakhsh, Somayeh Mahmoudi
Abstract:
Three reasons justify the use of decision theory in medicine: 1. The growth and complexity of medical knowledge make it difficult to process treatment information effectively without resorting to sophisticated analytical methods, especially when it comes to detecting errors and identifying treatment opportunities in large databases. 2. There is wide geographic variability in medical practice. In a context where medical costs are borne, at least in part, by the patient, these variations raise doubts about the relevance of the choices made by physicians. The differences are generally attributed to differing estimates of the probability of treatment success and differing assessments of the consequences of success or failure; without explicit decision criteria, it is difficult to identify precisely the sources of these variations in treatment. 3. Beyond the principle of informed consent, patients need to be involved in decision-making, and for this the decision process should be explained and broken down. A decision problem is the selection of the best option among a set of choices; the difficulty is what is meant by "best option", that is, knowing what criteria should guide the choice, and the purpose of decision theory is to answer this question. The systematic use of decision models allows us to better understand differences in medical practice and facilitates the search for consensus. Three types of situations arise: 1. In certain situations, the consequence of each decision is certain. 2. In risky situations, every decision can have several consequences, and the probability of each consequence is known. 3. In uncertain situations, each decision can have several consequences, but the probabilities are not known. Our aim in this article is to show how decision theory can usefully be mobilized to meet the needs of physicians.
Decision theory can make decisions more transparent: first, by systematically clarifying the data relevant to the problem, and second, by asking which basic principles should guide the choice. Once the problem is clarified, decision theory provides operational tools to represent the available information and determine patient preferences, and thus to assist the patient and doctor in their choices.
Keywords: decision making, medicine, treatment strategies, patient
Procedia PDF Downloads 579
19202 Solubility of Carbon Dioxide in Methoxy and Nitrile-Functionalized Ionic Liquids
Authors: D. A. Bruzon, G. Tapang, I. S. Martinez
Abstract:
Global warming and climate change are significant environmental concerns, which require immediate global action in carbon emission mitigation. The capture, sequestration, and conversion of carbon dioxide to other products such as methane or ethanol are ways to control excessive emissions. Ionic liquids have shown great potential among the materials studied as carbon capture solvents and catalysts in the reduction of CO2. In this study, ionic liquids comprising of a methoxy (-OCH3) and cyano (-CN) functionalized imidazolium cation, [MOBMIM] and [CNBMIM] respectively, paired with tris(pentafluoroethyl)trifluorophosphate [FAP] anion were evaluated as effective capture solvents, and organocatalysts in the reduction of CO2. An in-situ electrochemical set-up, which can measure controlled amounts of CO2 both in the gas and in the ionic liquid phase, was used. Initially, reduction potentials of CO2 in the CO2-saturated ionic liquids containing the internal standard cobaltocene were determined using cyclic voltammetry. Chronoamperometric transients were obtained at potentials slightly less negative than the reduction potentials of CO2 in each ionic liquid. The time-dependent current response was measured under a controlled atmosphere. Reduction potentials of CO2 in methoxy and cyano-functionalized [FAP] ionic liquids were observed to occur at ca. -1.0 V (vs. Cc+/Cc), which was significantly lower compared to the non-functionalized analog [PMIM][FAP], with an observed reduction potential of CO2 at -1.6 V (vs. Cc+/Cc). This decrease in the potential required for CO2 reduction in the functionalized ionic liquids shows that the functional groups methoxy and cyano effectively decreased the free energy of formation of the radical anion CO2●⁻, suggesting that these electrolytes may be used as organocatalysts in the reduction of the greenhouse gas. 
However, upon analyzing the solubility of the gas in each ionic liquid, [PMIM][FAP] showed the highest absorption capacity, at 4.81 mM under saturated conditions, compared to [MOBMIM][FAP] at 1.86 mM and [CNBMIM][FAP] at 0.76 mM. Also, Henry's constants determined from the concentration-pressure graph of each functionalized ionic liquid show that the groups -OCH3 and -CN attached terminally to a C4 alkyl chain do not significantly improve CO2 solubility.
Keywords: carbon capture, CO2 reduction, electrochemistry, ionic liquids
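The Henry's-constant extraction mentioned above (a constant determined from a concentration-pressure graph) can be sketched numerically as a least-squares fit through the origin; the data points below are hypothetical placeholders, not the study's measurements:

```python
# Sketch: estimating Henry's constant K_H from concentration-pressure data
# via a least-squares fit through the origin of p = K_H * c.
# All data points below are hypothetical, not the paper's measurements.

def henrys_constant(pressures_atm, concentrations_mM):
    """Fit p = K_H * c through the origin; return K_H in atm/mM."""
    num = sum(p * c for p, c in zip(pressures_atm, concentrations_mM))
    den = sum(c * c for c in concentrations_mM)
    return num / den

# Hypothetical CO2 solubility data for one ionic liquid
p = [0.2, 0.4, 0.6, 0.8, 1.0]        # CO2 partial pressure, atm
c = [0.95, 1.92, 2.88, 3.85, 4.81]   # dissolved CO2, mM

K_H = henrys_constant(p, c)
print(f"K_H = {K_H:.3f} atm/mM")     # larger K_H means lower solubility
```

A larger fitted constant corresponds to lower gas solubility, which is how the -OCH3 and -CN substituted liquids would be compared against the non-functionalized analog.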
Procedia PDF Downloads 404
19201 Comparison of the Hospital Patient Safety Culture between Bulgarian, Croatian and American: Preliminary Results
Authors: R. Stoyanova, R. Dimova, M. Tarnovska, T. Boeva, R. Dimov, I. Doykov
Abstract:
Patient safety culture (PSC) is an essential component of quality of healthcare. Improving PSC is considered a priority in many developed countries. A specialized software platform for registration and evaluation of hospital patient safety culture has been developed with the support of the Medical University Plovdiv Project №11/2017. The aim of the study is to assess the status of PSC in Bulgarian hospitals and to compare it to that in US and Croatian hospitals. Methods: The study was conducted from June 01 to July 31, 2018 using the web-based Bulgarian Version of the Hospital Survey on Patient Safety Culture Questionnaire (B-HSOPSC). Two hundred and forty-eight medical professionals from different hospitals in Bulgaria participated in the study. To quantify the differences in positive score distributions for each of the 42 HSOPSC items between the Bulgarian, Croatian and US samples, the χ²-test was applied. The research hypothesis assumed that there are no significant differences between the Bulgarian, Croatian and US PSCs. Results: The results revealed 14 significant differences in the positive scores between the Bulgarian and Croatian PSCs and 15 between the Bulgarian and the US PSC. Bulgarian medical professionals provided fewer positive responses to 12 items compared with Croatian and US respondents. The Bulgarian respondents were more positive compared to Croatians on the feedback and communication of medical errors (Items C1, C4, C5) as well as on the employment of locum staff (A7) and the frequency of reported mistakes (D1). Bulgarian medical professionals were more positive compared with their US colleagues on the communication of information at shift handover and across hospital units (F5, F7). 
The distributions of positive scores on the items ‘Staff worries that their mistakes are kept in their personnel file’ (RA16), ‘Things ‘fall between the cracks’ when transferring patients from one unit to another’ (RF3) and ‘Shift handovers are problematic for patients in this hospital’ (RF11) were significantly higher among Bulgarian respondents compared with Croatian and US respondents. Conclusions: Significant differences in positive score distributions were found between the Bulgarian and US PSCs on the one hand and between the Bulgarian and Croatian PSCs on the other. The study reveals that the distribution of positive responses could be explained by cultural, organizational and healthcare system differences.
Keywords: patient safety culture, healthcare, HSOPSC, medical error
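The per-item comparison described above reduces to a χ²-test on a 2x2 table of positive versus non-positive responses in two samples; a minimal sketch follows, with hypothetical counts rather than the study's data:

```python
# Sketch of the chi-square comparison applied to each HSOPSC item:
# positive vs non-positive counts in two samples form a 2x2 table.
# The counts below are hypothetical, not the study's data.

def chi_square_2x2(a, b, c, d):
    """Pearson chi-square statistic for the 2x2 table [[a, b], [c, d]]."""
    n = a + b + c + d
    num = n * (a * d - b * c) ** 2
    den = (a + b) * (c + d) * (a + c) * (b + d)
    return num / den

# Hypothetical item: 140/248 positive in sample 1, 95/210 in sample 2
stat = chi_square_2x2(140, 248 - 140, 95, 210 - 95)
print(f"chi2 = {stat:.2f}")  # compare against 3.84 (df = 1, alpha = 0.05)
```

A statistic above the critical value 3.84 (one degree of freedom) marks a significant difference in positive-response rates for that item.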
Procedia PDF Downloads 136
19200 Microkinetic Modelling of NO Reduction on Pt Catalysts
Authors: Vishnu S. Prasad, Preeti Aghalayam
Abstract:
The major harmful automobile exhaust pollutants are nitric oxide (NO) and unburned hydrocarbons (HC). Reduction of NO using unburned fuel HC as a reductant is the technique used in hydrocarbon-selective catalytic reduction (HC-SCR). In this work, we study the microkinetic modelling of NO reduction using propene as a reductant on Pt catalysts. Selectivity of NO reduction to N2O is detected in some ranges of operating conditions, whereas varying the inlet O2% causes a number of changes in the feasible regimes of operation.
Keywords: microkinetic modelling, NOx, platinum on alumina catalysts, selective catalytic reduction
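A mean-field microkinetic model of this general kind can be sketched as coupled surface-coverage ODEs integrated explicitly; the rate constants below are assumed round numbers for illustration, not the fitted parameters of the study:

```python
# Minimal microkinetic sketch (generic, assumed rate constants): NO and a
# propene-derived species compete for Pt sites; mean-field coverage ODEs
# are integrated with forward Euler to a steady state.

k_ads_no, k_des_no = 5.0, 1.0   # NO adsorption/desorption (1/s), assumed
k_ads_hc, k_des_hc = 3.0, 0.5   # HC adsorption/desorption (1/s), assumed
k_rxn = 2.0                      # surface reaction NO* + HC* -> products (1/s)

theta_no, theta_hc = 0.0, 0.0    # fractional surface coverages
dt, t_end = 1e-3, 10.0
for _ in range(int(t_end / dt)):
    free = 1.0 - theta_no - theta_hc           # vacant site fraction
    d_no = k_ads_no * free - k_des_no * theta_no - k_rxn * theta_no * theta_hc
    d_hc = k_ads_hc * free - k_des_hc * theta_hc - k_rxn * theta_no * theta_hc
    theta_no += dt * d_no
    theta_hc += dt * d_hc

rate = k_rxn * theta_no * theta_hc             # steady-state turnover estimate
print(f"theta_NO={theta_no:.3f}, theta_HC={theta_hc:.3f}, rate={rate:.3f}")
```

Sweeping the adsorption constants (e.g. scaling NO adsorption with inlet O2%) is how such a model maps out the feasible regimes of operation mentioned in the abstract.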
Procedia PDF Downloads 458
19199 Mapping of Renovation Potential in Rudersdal Municipality Based on a Sustainability Indicator Framework
Authors: Barbara Eschen Danielsen, Morten Niels Baxter, Per Sieverts Nielsen
Abstract:
Europe is currently in an energy and climate crisis, which requires more sustainable solutions than those used before. Europe uses 40% of its energy in buildings, so there is now a significant focus on finding and committing to new initiatives to reduce energy consumption in buildings. In 2021, the European Union introduced a building standard to be upheld by 2030. This new building standard requires a significant reduction of CO2 emissions from both privately and publicly owned buildings; the overall aim is to achieve a zero-emission building stock by 2050. The EU is revising the Energy Performance of Buildings Directive (EPBD) as part of the “Fit for 55” package, adopted on March 14, 2023. The new directive’s main goal is to renovate the least energy-efficient homes in Europe. A renovation project entails a cost for the homeowner, but it also improves energy efficiency and therefore reduces costs. After the implementation of the EU directive, many homeowners will have to focus their attention on how to make the most effective energy renovations of their homes. The new directive will affect almost one million Danish homes (30%), as they do not meet the newly implemented requirements for energy efficiency. The problem for these one million homeowners is that it is not easy to decide which renovation project to pursue: houses are built differently, and there are many possible solutions. The main focus of this paper is to identify the most impactful solutions and to evaluate them with a criteria-based sustainability indicator framework. The result of the analysis gives each homeowner insight into the various renovation options, including both advantages and disadvantages, with the aim of avoiding unnecessary costs and errors while minimizing their CO2 footprint. 
Given that the new EU directive impacts a significant number of homeowners and their homes both in Denmark and in the rest of the European Union, it is crucial to clarify which renovations have the most environmental impact and which are the most cost-effective. We have evaluated the 10 most impactful solutions in an indicator framework which includes 9 indicators and covers economic, environmental as well as social factors. We have grouped the results of the analysis into three packages: the most cost-effective (short-term), the most cost-effective (long-term) and the most sustainable. The results of the study ensure transparency and thereby provide homeowners with a tool to support their decision-making. The analysis is based on mostly qualitative indicators, but it will be possible to evaluate most of the indicators quantitatively in a future study.
Keywords: energy efficiency, building renovation, renovation solutions, building energy performance criteria
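A criteria-based indicator framework of the kind described can be sketched as a weighted-sum ranking of renovation packages; the indicator names, weights, and scores below are hypothetical placeholders, not the study's evaluation:

```python
# Illustrative sketch of a nine-indicator framework: each renovation
# package is scored per indicator and ranked by weighted sum. Indicator
# names, weights, and scores are hypothetical, not the study's data.

indicators = {  # weight per indicator, summing to 1.0 (assumed)
    "investment_cost": 0.15, "payback_time": 0.15, "energy_saving": 0.15,
    "co2_reduction": 0.15, "material_impact": 0.10, "lifetime": 0.10,
    "comfort": 0.10, "disruption": 0.05, "aesthetics": 0.05,
}

packages = {  # score 1 (worst) .. 5 (best) per indicator, hypothetical
    "short_term_cost":  [4, 5, 2, 2, 3, 2, 3, 4, 3],
    "long_term_cost":   [2, 3, 4, 4, 3, 5, 4, 3, 3],
    "most_sustainable": [1, 2, 5, 5, 5, 5, 4, 2, 4],
}

def weighted_score(scores):
    return sum(w * s for w, s in zip(indicators.values(), scores))

ranking = sorted(packages, key=lambda p: weighted_score(packages[p]), reverse=True)
for name in ranking:
    print(f"{name}: {weighted_score(packages[name]):.2f}")
```

Changing the weights (e.g. emphasising economic over environmental indicators) reorders the packages, which is exactly the transparency the framework is meant to provide to homeowners.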
Procedia PDF Downloads 90
19198 Reading and Writing of Biscriptal Children with and Without Reading Difficulties in Two Alphabetic Scripts
Authors: Baran Johansson
Abstract:
This PhD dissertation aimed to explore children’s writing and reading in L1 (Persian) and L2 (Swedish). It adds new perspectives to reading and writing studies of bilingual biscriptal children with and without reading and writing difficulties (RWD). The study used standardised tests to examine linguistic and cognitive skills related to word reading and writing fluency in both languages. Furthermore, all participants produced two texts (one descriptive and one narrative) in each language. The writing processes and the writing products of these children were explored using logging methodologies (Eye and Pen) for both languages. Furthermore, this study investigated how two bilingual children with RWD presented themselves through writing across their languages. To my knowledge, studies utilizing standardised tests and logging tools to investigate bilingual children’s word reading and writing fluency across two different alphabetic scripts are scarce. There have been few studies analysing how bilingual children construct meaning in their writing, and none have focused on children who write in two different alphabetic scripts or those with RWD. Therefore, some aspects of the systemic functional linguistics (SFL) perspective were employed to examine how two participants with RWD created meaning in their written texts in each language. The results revealed that children with and without RWD had higher writing fluency in all measures (e.g. text lengths, writing speed) in their L2 compared to their L1. Word reading abilities in both languages were found to influence their writing fluency. The findings also showed that bilingual children without reading difficulties performed 1 standard deviation below the mean when reading words in Persian. However, their reading performance in Swedish aligned with the expected age norms, suggesting greater efficiency in reading Swedish than in Persian. 
Furthermore, the results showed that the level of orthographic depth, consistency between graphemes and phonemes, and orthographic features can probably explain these differences across languages. The analysis of meaning-making indicated that the participants with RWD exhibited varying levels of difficulty, which influenced their knowledge and usage of writing across languages. For example, the participant with poor word recognition (PWR) presented himself similarly across genres, irrespective of the language in which he wrote. He employed the listing technique similarly across his L1 and L2. However, the participant with mixed reading difficulties (MRD) had difficulties with both transcription and text production. He produced spelling errors and frequently paused in both languages. He also struggled with word retrieval and producing coherent texts, consistent with studies of monolingual children with poor comprehension or with developmental language disorder. The results suggest that the mother tongue instruction provided to the participants has not been sufficient for them to become balanced biscriptal readers and writers in both languages. Therefore, increasing the number of hours dedicated to mother tongue instruction and motivating the children to participate in these classes could be potential strategies to address this issue.
Keywords: reading, writing, reading and writing difficulties, bilingual children, biscriptal
Procedia PDF Downloads 72
19197 Uncertainty Quantification of Corrosion Anomaly Length of Oil and Gas Steel Pipelines Based on Inline Inspection and Field Data
Authors: Tammeen Siraj, Wenxing Zhou, Terry Huang, Mohammad Al-Amin
Abstract:
The high resolution inline inspection (ILI) tool is used extensively in the pipeline industry to identify, locate, and measure metal-loss corrosion anomalies on buried oil and gas steel pipelines. Corrosion anomalies may occur singly (i.e. individual anomalies) or as clusters (i.e. a colony of corrosion anomalies). Although the ILI technology has advanced immensely, there are measurement errors associated with the sizes of corrosion anomalies reported by ILI tools due to limitations of the tools and their associated sizing algorithms, and the detection threshold of the tools (i.e. the minimum detectable feature dimension). Quantifying the measurement error in the ILI data is crucial for corrosion management and for developing maintenance strategies that satisfy the safety and economic constraints. Studies on the measurement error associated with the length of corrosion anomalies (in the longitudinal direction of the pipeline) have rarely been reported in the literature, and this error is investigated in the present study. Limitations in the ILI tool and clustering process can sometimes cause clustering error, which is defined as the error introduced during the clustering process by including or excluding a single anomaly or group of anomalies in or from a cluster. Clustering error has been found to be one of the biggest contributory factors for the relatively high uncertainties associated with ILI-reported anomaly length. As such, this study focuses on developing a consistent and comprehensive framework to quantify the measurement errors in the ILI-reported anomaly length by comparing the ILI data and corresponding field measurements for individual and clustered corrosion anomalies. The analysis carried out in this study is based on the ILI and field measurement data for a set of anomalies collected from two segments of a buried natural gas pipeline currently in service in Alberta, Canada. 
Data analyses showed that the measurement error associated with the ILI-reported length of anomalies without clustering error, denoted as Type I anomalies, is markedly less than that for anomalies with clustering error, denoted as Type II anomalies. A methodology employing data mining techniques is further proposed to classify the Type I and Type II anomalies based on the ILI-reported corrosion anomaly information.
Keywords: clustered corrosion anomaly, corrosion anomaly assessment, corrosion anomaly length, individual corrosion anomaly, metal-loss corrosion, oil and gas steel pipeline
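One common way to quantify the ILI-versus-field length discrepancy described above is an ordinary least-squares regression of field-measured lengths on ILI-reported lengths; the paired values below are hypothetical, not the Alberta pipeline data:

```python
# Sketch: quantifying ILI length measurement error by regressing
# field-measured anomaly lengths on ILI-reported lengths (OLS).
# The paired lengths below are hypothetical, not the study's data.

def ols(x, y):
    """Return (slope, intercept) of the least-squares line y = slope*x + intercept."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    slope = sxy / sxx
    return slope, my - slope * mx

ili   = [12.0, 25.0, 33.0, 48.0, 60.0, 75.0]   # ILI-reported length, mm
field = [10.5, 26.1, 30.8, 50.2, 57.9, 77.4]   # field-measured length, mm

slope, intercept = ols(ili, field)
residuals = [fl - (slope * il + intercept) for il, fl in zip(ili, field)]
print(f"slope = {slope:.3f}, intercept = {intercept:.2f} mm")
```

A slope near 1 and intercept near 0 indicate an unbiased tool; the residual scatter estimates the random component of the measurement error, which the abstract reports is much larger for Type II (clustering-error) anomalies.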
Procedia PDF Downloads 310
19196 Hardware Implementation for the Contact Force Reconstruction in Tactile Sensor Arrays
Authors: María-Luisa Pinto-Salamanca, Wilson-Javier Pérez-Holguín
Abstract:
Reconstruction of contact forces is a fundamental technique for analyzing the properties of a touched object and is essential for regulating the grip force in slip control loops. It is based on processing the distribution, intensity, and direction of the forces captured by the sensors. Currently, efficient hardware alternatives are being used more frequently in different fields of application, allowing the implementation of computationally complex algorithms, as is the case with tactile signal processing. The use of hardware for smart tactile sensing systems is a research area that promises to improve the processing time and portability requirements of applications such as artificial skin and robotics, among others. The literature review shows that hardware implementations are present today in almost all stages of smart tactile detection systems except in the force reconstruction process, a stage in which they have been less applied. This work presents a hardware implementation of a model-driven approach reported in the literature for the contact force reconstruction of flat and rigid tactile sensor arrays from normal stress data. Based on the analysis of a software implementation of such a model, this implementation proposes the parallelization of tasks that facilitate the execution of matrix operations and of a two-dimensional optimization function to obtain a force vector for each taxel in the array. This work seeks to take advantage of the parallel hardware characteristics of Field Programmable Gate Arrays (FPGAs) and the possibility of applying appropriate techniques for algorithm parallelization, guided by the rules of generalization, efficiency, and scalability in the tactile decoding process, and considering low latency, low power consumption, and real-time execution as the main design parameters. 
The results show a maximum estimation error of 32% in the tangential forces and 22% in the normal forces with respect to simulation by the Finite Element Modeling (FEM) technique of Hertzian and non-Hertzian contact events, over sensor arrays of 10×10 taxels of different sizes. The hardware implementation was carried out on a Xilinx® MPSoC XCZU9EG-2FFVB1156 platform that allows the reconstruction of force vectors following a scalable approach, from the information captured by tactile sensor arrays composed of up to 48 × 48 taxels that use various transduction technologies. The proposed implementation demonstrates a reduction in estimation time by a factor of about 180 compared to software implementations. Despite the relatively high values of the estimation errors, the information provided by this implementation on the tangential and normal tractions and the triaxial reconstruction of forces makes it possible to adequately reconstruct the tactile properties of the touched object, which are similar to those obtained in the software implementation and in the two FEM simulations taken as reference. Although the errors could be reduced, the proposed implementation is useful for decoding contact forces in portable tactile sensing systems, thus helping to expand electronic skin applications in robotic and biomedical contexts.
Keywords: contact forces reconstruction, forces estimation, tactile sensor array, hardware implementation
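The core linear-inverse step behind this kind of force reconstruction can be sketched generically as a least-squares solve of the normal equations; the small model matrix and stress readings below are illustrative, not the model of the paper:

```python
# Generic sketch of the linear-inverse step in contact force
# reconstruction: measured stresses s relate to unknown force components
# f through a model matrix A, and f is recovered by least squares via
# the normal equations A^T A f = A^T s. The matrix and readings below
# are made up for illustration, not the paper's model.

def transpose(X):
    return [list(row) for row in zip(*X)]

def matmat(X, Y):
    return [[sum(x * y for x, y in zip(row, col)) for col in zip(*Y)] for row in X]

def matvec(X, v):
    return [sum(x * vi for x, vi in zip(row, v)) for row in X]

def solve(M, b):
    """Gaussian elimination with partial pivoting (small dense systems)."""
    n = len(M)
    M = [row[:] + [bi] for row, bi in zip(M, b)]
    for k in range(n):
        p = max(range(k, n), key=lambda r: abs(M[r][k]))
        M[k], M[p] = M[p], M[k]
        for r in range(k + 1, n):
            fac = M[r][k] / M[k][k]
            for c in range(k, n + 1):
                M[r][c] -= fac * M[k][c]
    x = [0.0] * n
    for k in range(n - 1, -1, -1):
        x[k] = (M[k][n] - sum(M[k][c] * x[c] for c in range(k + 1, n))) / M[k][k]
    return x

# Illustrative 4-measurement, 3-component system (fx, fy, fz)
A = [[0.2, 0.0, 1.0],
     [0.0, 0.2, 1.0],
     [-0.2, 0.0, 1.0],
     [0.0, -0.2, 1.0]]
s = [1.06, 1.02, 0.94, 0.98]            # simulated stress readings

At = transpose(A)
f = solve(matmat(At, A), matvec(At, s))  # [fx, fy, fz]
print("reconstructed force:", [round(v, 3) for v in f])
```

On an FPGA, the matrix products inside the normal equations are exactly the operations that parallelize well, which motivates the task-level parallelization described in the abstract.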
Procedia PDF Downloads 196
19195 Alginate Wrapped NiO-ZnO Nanocomposites-Based Catalyst for the Reduction of Methylene Blue
Authors: Mohamed A. Adam Abakar, Abdullah M. Asiri, Sher Bahadar Khan
Abstract:
In this paper, a nickel oxide-zinc oxide (NiO-ZnO) catalyst was embedded in an alginate polymer (Na alg/NiO-ZnO), a nanocomposite that was used as a nano-catalyst for the catalytic conversion of deleterious contaminants such as organic dyes (Acridine Orange “ArO”, Methylene Blue “MB”, Methyl Orange “MO”) and 4-Nitrophenol “4-NP”. FESEM, EDS, FTIR and XRD techniques were used to identify the shape and structure of the nano-catalyst (Na alg/NiO-ZnO). UV spectrophotometry was used to collect the results, which showed the greatest and fastest reduction rate for MB (illustrated in figures 2, 3, 4 and 5). Data were recorded and processed, and graphs were drawn and analyzed using Origin 2018. The reduction percentage of MB was assessed to be 95.25% in just 13 minutes. Furthermore, the catalytic property of Na alg/NiO-ZnO in the reduction of organic dyes was investigated using various catalyst amounts, dye types, reaction times and reducing agent dosages at room temperature (rt). The NaBH4-assisted reduction of organic dyes was studied using Na alg/NiO-ZnO as a potential catalyst.
Keywords: alginate, metal oxides, nanocomposites-based catalysts, reduction, photocatalytic degradation, water treatment
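Dye-reduction experiments of this kind are commonly analysed with pseudo-first-order kinetics, where the apparent rate constant is the slope of ln(A_t/A_0) versus time; the absorbance series below is hypothetical (chosen to be consistent with roughly 95% reduction in 13 minutes), not the paper's raw data:

```python
# Sketch: pseudo-first-order analysis of a dye reduction, with the
# apparent rate constant k_app taken as the slope of ln(A_t/A_0) vs t.
# The absorbance values are hypothetical, not the paper's raw data.
import math

t_min = [0, 2, 4, 6, 8, 10, 13]                    # time, minutes
A = [1.00, 0.63, 0.40, 0.25, 0.16, 0.10, 0.048]    # MB absorbance (a.u.)

y = [math.log(a / A[0]) for a in A]
n = len(t_min)
mt, my = sum(t_min) / n, sum(y) / n
k_app = -sum((t - mt) * (yi - my) for t, yi in zip(t_min, y)) / \
         sum((t - mt) ** 2 for t in t_min)

reduction_pct = 100.0 * (1.0 - A[-1] / A[0])
print(f"k_app = {k_app:.3f} 1/min, reduction = {reduction_pct:.1f}%")
```

Repeating this fit for each catalyst amount and NaBH4 dosage is how the rate comparisons across dyes (MB fastest) would typically be made quantitative.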
Procedia PDF Downloads 72
19194 Detection of PCD-Related Transcription Factors for Improving Salt Tolerance in Plant
Authors: A. Bahieldin, A. Atef, S. Edris, N. O. Gadalla, S. M. Hassan, M. A. Al-Kordy, A. M. Ramadan, A. S. M. Al- Hajar, F. M. El-Domyati
Abstract:
The idea of this work is based on an intriguing natural phenomenon suggesting that suppression of genes related to the programmed cell death (PCD) mechanism might help plant cells efficiently tolerate abiotic stresses. The scope of this work was the detection of PCD-related transcription factors (TFs) that might also be related to salt stress tolerance in plants. Two model plants, i.e., tobacco and Arabidopsis, were utilized in order to investigate this phenomenon. Occurrence of PCD was first proven by Evans blue staining and DNA laddering after tobacco leaf discs were treated with oxalic acid (OA, 20 mM) for 24 h. A number of 31 TFs up-regulated after 2 h and co-expressed with genes harboring PCD-related domains were detected via RNA-Seq analysis and annotation. These TFs were knocked down via virus-induced gene silencing (VIGS), an RNA interference (RNAi) approach, and tested for their influence on triggering the PCD machinery. Then, Arabidopsis SALK knock-out T-DNA insertion mutants in TFs analogous to those selected in tobacco were tested under salt stress (up to 250 mM NaCl) in order to detect the influence of the different TFs on conferring salt tolerance in Arabidopsis. The involvement of a number of candidate abiotic-stress-related TFs was investigated.
Keywords: VIGS, PCD, RNA-Seq, transcription factors
Procedia PDF Downloads 274
19193 A Robust and Adaptive Unscented Kalman Filter for the Air Fine Alignment of the Strapdown Inertial Navigation System/GPS
Authors: Jian Shi, Baoguo Yu, Haonan Jia, Meng Liu, Ping Huang
Abstract:
To meet the demands of modern flexible warfare, a large number of guided weapons are launched from aircraft. The inertial navigation system loaded in such a weapon therefore needs to undergo an alignment process in the air. This article addresses the problems of inaccurate system modeling under large misalignment angles, reduced filtering accuracy caused by outliers, and noise changes in GPS signals as follows: first, considering the large misalignment errors of the Strapdown Inertial Navigation System (SINS)/GPS, a more accurate model is built rather than making a small-angle approximation, and the Unscented Kalman Filter (UKF) algorithm is used to estimate the state; then, taking into account the impact of GPS noise changes on the fine alignment algorithm, an innovation-based adaptive filtering algorithm is introduced to estimate the GPS noise in real time; at the same time, in order to improve the anti-interference ability of the air fine alignment algorithm, a robust filtering algorithm based on outlier detection is combined with the air fine alignment algorithm to improve its robustness. The algorithm can improve alignment accuracy and robustness under interference conditions, which is verified by simulation.
Keywords: air alignment, fine alignment, inertial navigation system, integrated navigation system, UKF
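The two robustness ideas, innovation-based noise estimation and outlier rejection, can be illustrated in a much simpler 1-D linear Kalman filter (the paper applies them inside a full UKF); all noise levels and thresholds below are assumed for illustration:

```python
# Simplified 1-D linear analogue of the two robustness mechanisms:
# (1) estimate the measurement noise R online from a sliding window of
# innovations, (2) reject measurements whose normalised innovation
# squared exceeds a chi-square gate. All parameters are assumed.
import random

random.seed(1)
x_true, x_est, P = 0.0, 0.0, 1.0
Q, R_est = 0.01, 1.0                  # process noise, initial R guess
window, win_len, gate = [], 20, 9.0   # chi2(1) 99.7% threshold ~ 9

rejected = 0
for k in range(500):
    x_true += random.gauss(0.0, Q ** 0.5)      # random-walk true state
    z = x_true + random.gauss(0.0, 0.2)        # true R = 0.04
    if k % 50 == 25:
        z += 5.0                               # injected outlier
    P += Q                                     # predict
    nu = z - x_est                             # innovation
    S = P + R_est
    if nu * nu / S > gate:                     # outlier gate: skip update
        rejected += 1
        continue
    window.append(nu * nu)
    if len(window) > win_len:
        window.pop(0)
    R_est = max(1e-4, sum(window) / len(window) - P)  # adaptive R
    K = P / S                                  # Kalman update
    x_est += K * nu
    P *= (1.0 - K)

print(f"R_est = {R_est:.3f} (true 0.040), outliers rejected = {rejected}")
```

In the full SINS/GPS filter, the same innovation statistics are computed from the UKF's predicted measurement, but the gating and windowed noise estimate work identically.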
Procedia PDF Downloads 171
19192 Improving the Weekend Handover in General Surgery: A Quality Improvement Project
Authors: Michael Ward, Eliana Kalakouti, Andrew Alabi
Abstract:
Aim: The handover process is recognized as a vulnerable step in the patient care pathway where errors are likely to occur. As such, it is a major preventable cause of patient harm due to the human factors of poor communication and systematic error. The aim of this study was to audit the general surgery department’s weekend handover process against the recommended criteria for safe handover as set out by the Royal College of Surgeons (RCS). Method: A retrospective audit of the General Surgery department’s Friday patient lists and patient medical notes used for weekend handover in a London-based District General Hospital (DGH) was conducted. Medical notes were analyzed against the RCS's suggested criteria for handover. A standardized paper weekend handover proforma was then developed in accordance with guidelines and circulated in the department. A post-intervention audit was then conducted using the same methods (cycle 1). For cycle 2, we introduced an electronic weekend handover tool along with Electronic Patient Records (EPR). After a one-month period, a second post-intervention audit was conducted. Results: Following cycle 1, the paper weekend handover proforma was used in only 23% of patient notes. However, when it was used, 100% of proformas documented a plan for the weekend, diagnosis and location, but only 40% documented potential discharge status and 40% ceiling of care status. Qualitative feedback was that it was time-consuming to fill out. Better results were achieved following cycle 2, with 100% of patient notes having the electronic proforma. Results improved further, with every patient having a documented ceiling of care, discharge status and location. Only 55% of patients had a documented past surgical history; however, this was still an increase when compared to the paper proforma (45%). When comparing the electronic versus the paper proforma, there was an increase in documentation in every domain of the handover outlined by the RCS, with an average relative increase of 1.72 times (p<0.05). 
Qualitative feedback was that the autofill function made it easy to use and simple to view. Conclusion: These results demonstrate that the implementation of an electronic autofill handover proforma significantly improved handover compliance with RCS guidelines, thereby improving the transmission of information from weekday to weekend teams.
Keywords: surgery, handover, proforma, electronic handover, weekend, general surgery
Procedia PDF Downloads 159
19191 Biosensor: An Approach towards Sustainable Environment
Authors: Purnima Dhall, Rita Kumar
Abstract:
Introduction: River Yamuna flows through the national capital territory (NCT) and is the primary source of drinking water for the city. Delhi discharges about 3,684 MLD of sewage into the Yamuna through its 18 drains. Water quality monitoring is an important aspect of water management concerning pollution control. Public concern and legislation are nowadays demanding better environmental control. The conventional method for estimating BOD5 has various drawbacks: it is expensive, time-consuming, and requires highly trained personnel. Stringent forthcoming regulations on wastewater have necessitated the development of analytical systems which contribute to greater process efficiency. Biosensors offer the possibility of real-time analysis. Methodology: In the present study, a novel rapid method for the determination of biochemical oxygen demand (BOD) has been developed. Using the developed method, the BOD of a sample can be determined within 2 hours, as compared to 3-5 days with the standard BOD3-5day assay. Moreover, the test is based on a specified consortium instead of undefined seeding material, which minimizes variability among results. The device is coupled to software which automatically calculates the required dilution, so prior dilution of the sample is not needed before BOD estimation. The developed BOD biosensor makes use of immobilized microorganisms to sense the biochemical oxygen demand of industrial wastewaters having low, moderate or high biodegradability. The method is quick, robust, online and less time-consuming. Findings: The results of extensive testing of the developed biosensor on drains demonstrate that the BOD values obtained by the device correlated well with conventional BOD values (R2 = 0.995). The reproducibility of the measurements with the BOD biosensor was within a percentage deviation of ±10%. 
Advantages of the developed BOD biosensor: • determines water pollution quickly, within 2 hours; • determines the pollution of all types of wastewater; • has a prolonged shelf life of more than 400 days; • enhanced repeatability and reproducibility; • eliminates the need for COD estimation. Distinctiveness of the technology: • Bio-component: can determine the BOD load of all types of wastewater; • Immobilization: increased shelf life (> 400 days), extended stability and viability; • Software: reduces manual errors and estimation time. Conclusion: The BOD biosensor can be used to measure the BOD value of real wastewater samples and showed good reproducibility in the results. This technology is useful in deciding treatment strategies well ahead and so facilitates the discharge of properly treated water into common water bodies. The developed technology has been transferred to M/s Forbes Marshall Pvt Ltd, Pune.
Keywords: biosensor, biochemical oxygen demand, immobilized, monitoring, Yamuna
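Two of the calculations the described platform automates, the required dilution and the sensor-versus-reference percentage deviation, can be sketched as follows; the assay working range and the sample values are assumed for illustration, not taken from the study:

```python
# Sketch of two calculations such a platform automates: (1) the dilution
# factor needed to bring an expected BOD into the assay's working range,
# and (2) the percent deviation between biosensor and conventional BOD
# values used for the reproducibility check. The 6 mg/L working range,
# doubling dilution series, and sample values are assumed.

def dilution_factor(expected_bod_mgL, range_max_mgL=6.0):
    """Smallest power-of-two dilution putting the sample in range."""
    factor = 1
    while expected_bod_mgL / factor > range_max_mgL:
        factor *= 2
    return factor

def percent_deviation(sensor, reference):
    return 100.0 * (sensor - reference) / reference

print(dilution_factor(250.0))                       # heavily loaded drain sample
print(f"{percent_deviation(231.0, 250.0):+.1f}%")   # checked against the +/-10% criterion
```

Automating the dilution choice is what removes the manual pre-dilution step (and its transcription errors) mentioned in the abstract.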
Procedia PDF Downloads 279
19190 Execution of Optimization Algorithm in Cascaded H-Bridge Multilevel Inverter
Authors: M. Suresh Kumar, K. Ramani
Abstract:
This paper proposes harmonic elimination in a Cascaded H-Bridge Multi-Level Inverter by using the Selective Harmonic Elimination-Pulse Width Modulation method programmed with a Particle Swarm Optimization algorithm. The PSO method efficiently determines the switching angles required to eliminate low-order harmonics up to the 11th order from the inverter output voltage waveform while keeping the magnitude of the fundamental at the desired value. Results demonstrate that the proposed method efficiently eliminates a great number of specific harmonics, and the output voltage exhibits minimum Total Harmonic Distortion. The results also show that the PSO algorithm reaches the global solution faster than other algorithms.
Keywords: multi-level inverter, Selective Harmonic Elimination Pulse Width Modulation (SHEPWM), Particle Swarm Optimization (PSO), Total Harmonic Distortion (THD)
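A minimal PSO sketch of the SHE-PWM angle problem might look like the following; here a 7-level inverter with quarter-wave symmetry is assumed, nulling only the 5th and 7th harmonics (the paper eliminates orders up to the 11th), and the swarm parameters and modulation index are assumed values:

```python
# Minimal PSO sketch for SHE-PWM (assumed setup: 7-level cascaded
# H-bridge, quarter-wave symmetry, 3 switching angles): set the
# fundamental to a target modulation index while nulling the 5th and
# 7th harmonics. Swarm parameters and M are illustrative assumptions.
import math, random

random.seed(0)
M = 0.8                       # modulation index target: sum(cos) = 3*M

def cost(theta):
    f1 = sum(math.cos(t) for t in theta) - 3 * M   # fundamental error
    f5 = sum(math.cos(5 * t) for t in theta)       # 5th harmonic
    f7 = sum(math.cos(7 * t) for t in theta)       # 7th harmonic
    return f1 * f1 + f5 * f5 + f7 * f7

lo, hi = 0.01, math.pi / 2 - 0.01                  # angle bounds (rad)
n_part, n_iter, dim = 40, 300, 3
w, c1, c2 = 0.72, 1.49, 1.49                       # standard PSO weights

pos = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(n_part)]
vel = [[0.0] * dim for _ in range(n_part)]
pbest = [p[:] for p in pos]
pbest_cost = [cost(p) for p in pos]
g = min(range(n_part), key=lambda i: pbest_cost[i])
gbest, gbest_cost = pbest[g][:], pbest_cost[g]

for _ in range(n_iter):
    for i in range(n_part):
        for d in range(dim):
            vel[i][d] = (w * vel[i][d]
                         + c1 * random.random() * (pbest[i][d] - pos[i][d])
                         + c2 * random.random() * (gbest[d] - pos[i][d]))
            pos[i][d] = min(hi, max(lo, pos[i][d] + vel[i][d]))
        c = cost(pos[i])
        if c < pbest_cost[i]:
            pbest[i], pbest_cost[i] = pos[i][:], c
            if c < gbest_cost:
                gbest, gbest_cost = pos[i][:], c

deg = sorted(math.degrees(t) for t in gbest)
print(f"angles = {[round(a, 2) for a in deg]} deg, cost = {gbest_cost:.2e}")
```

Extending the same cost function with cos(11·θ) terms (and more H-bridge cells, hence more angles) recovers the problem size treated in the paper.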
Procedia PDF Downloads 604
19189 ChatGPT 4.0 Demonstrates Strong Performance in Standardised Medical Licensing Examinations: Insights and Implications for Medical Educators
Authors: K. O'Malley
Abstract:
Background: The emergence and rapid evolution of large language models (LLMs), i.e., models of generative artificial intelligence (AI), have been unprecedented. ChatGPT is one of the most widely used LLM platforms. Using natural language processing technology, it generates customized responses to user prompts, enabling it to mimic human conversation. Responses are generated using predictive modeling of vast swathes of internet text and data and are further refined and reinforced through user feedback. The popularity of LLMs is increasing, with a growing number of students utilizing these platforms for study and revision purposes. Notwithstanding its many novel applications, LLM technology is inherently susceptible to bias and error. This poses a significant challenge in the educational setting, where academic integrity may be undermined. This study aims to evaluate the performance of the latest iteration of ChatGPT (ChatGPT 4.0) in standardized state medical licensing examinations. Methods: A structured search strategy was used to interrogate the PubMed electronic database. The keywords ‘ChatGPT’ AND ‘medical education’ OR ‘medical school’ OR ‘medical licensing exam’ were used to identify relevant literature. The search included all peer-reviewed literature published in the past five years. The search was limited to publications in the English language only. Eligibility was ascertained based on the study title and abstract and confirmed by consulting the full-text document. Data were extracted into a Microsoft Excel document for analysis. Results: The search yielded 345 publications that were screened. 225 original articles were identified, of which 11 met the pre-determined criteria for inclusion in a narrative synthesis. These studies included performance assessments in national medical licensing examinations from the United States, United Kingdom, Saudi Arabia, Poland, Taiwan, Japan and Germany. ChatGPT 4.0 achieved scores ranging from 67.1 to 88.6 percent. 
The mean score across all studies was 82.49 percent (SD = 5.95). In all studies, ChatGPT exceeded the threshold for a passing grade in the corresponding exam. Conclusion: The capabilities of ChatGPT in standardized academic assessment in medicine are robust. While this technology can potentially revolutionize higher education, it also presents several challenges with which educators have not had to contend before. The overall strong performance of ChatGPT, as outlined above, may lend itself to unfair use (such as the plagiarism of deliverable coursework) and pose unforeseen ethical challenges (arising from algorithmic bias). Conversely, it highlights potential pitfalls if users assume LLM-generated content to be entirely accurate. In the aforementioned studies, ChatGPT exhibits a margin of error between 11.4 and 32.9 percent, which resonates strongly with concerns regarding the quality and veracity of LLM-generated content. It is imperative to highlight these limitations, particularly to students in the early stages of their education who are less likely to possess the requisite insight or knowledge to recognize errors, inaccuracies or false information. Educators must inform themselves of these emerging challenges to effectively address them and mitigate potential disruption in academic fora.
Keywords: artificial intelligence, ChatGPT, generative AI, large language models, licensing exam, medical education, medicine, university
Procedia PDF Downloads 34