Search results for: approach to patient
14674 Psychedelic-Assisted Treatment for Patients with Opioid Use Disorder
Authors: Daniele Zullino, Gabriel Thorens, Léonice Furtado, Federico Seragnoli, Radu Iuga, Louise Penzenstadler
Abstract:
Context: Since the start of the 21st century, there has been a resurgence of interest in psychedelics, marked by a renewed focus on scientific investigation of their therapeutic potential. Psychedelic therapy has gained recognition for effectively treating depression and anxiety disorders, and notable progress has been made in the clinical development of substances such as psilocybin. Moreover, mounting evidence suggests promising applications of lysergic acid diethylamide (LSD) and psilocybin in addiction medicine. In Switzerland, compassionate treatment with LSD and psilocybin has been permitted since 2014 through exceptional licenses granted by the Federal Office of Public Health. This treatment approach is also available within the Geneva treatment program, extending its accessibility to patients undergoing opioid-assisted treatment with substances such as morphine and diacetylmorphine. The aim of this study is to assess the feasibility of psychedelic-assisted therapy in patients with opioid use disorder who are undergoing opioid-assisted treatment. The study addresses the question of whether psychedelic-assisted therapy can be successfully implemented in this population, and explores the effects of psychedelic therapy on patients' experiences and outcomes. Methodology: This is an open case series of six patients who underwent at least one session with either LSD (100-200 micrograms) or psilocybin (20-40 mg). The patients were assessed using the Five Dimensional Altered States of Consciousness (5D-ASC) Scale. The data were analyzed descriptively to identify patterns and trends in the patients' experiences. Results: The patients experienced substantial positive psychedelic effects during the psychedelic sessions without significant adverse effects. The patients reported positive experiences and improvements in their condition.
Conclusion: The findings of this study support the feasibility and potential efficacy of psychedelic-assisted therapy in patients undergoing opioid-assisted treatment.
Keywords: psychedelics, psychedelic-assisted treatment, opioid use disorder, addiction, LSD, psilocybin
Procedia PDF Downloads 55
14673 Development of PVA/polypyrrole Scaffolds by Supercritical CO₂ for Its Application in Biomedicine
Authors: Antonio Montes, Antonio Cozar, Clara Pereyra, Diego Valor, Enrique Martinez de la Ossa
Abstract:
Tissues and organs can be damaged by trauma, congenital illness, or cancer, and traditional therapeutic alternatives such as surgery usually cannot completely repair the damaged tissues. Tissue engineering allows regeneration of the patient's tissues, reducing the problems caused by traditional methods. Scaffolds, polymeric structures with interconnected porosity, can promote the proliferation and adhesion of the patient's cells in the damaged area. Furthermore, by impregnating the scaffold with beneficial active substances, tissue regeneration can be induced through a drug delivery process. The objective of this work is the fabrication of a PVA scaffold coated with gallic acid and polypyrrole through a one-step foaming and impregnation process using the SSI technique (Supercritical Solvent Impregnation). In this technique, supercritical CO₂ penetrates the polymer chains, producing plasticization of the polymer. In the depressurization step, CO₂ cell nucleation and growth take place, leading to an interconnected porous structure in the polymer. The foaming process using supercritical CO₂ as solvent and expansion agent presents advantages over traditional scaffold fabrication methods, such as the polymer's high solubility in the solvent and the possibility of carrying out the process at low temperature, avoiding inactivation of the active substance. In this sense, supercritical CO₂ avoids the use of organic solvents and reduces solvent residues in the final product. Moreover, the process does not require long processing times that could cause stratification of the substance inside the scaffold, reducing the therapeutic efficiency of the formulation.
An experimental design has been carried out to optimize the SSI operating conditions, together with a study of the morphological characteristics relevant to the scaffold's use in tissue engineering, such as porosity, conductivity, and the release profile of the active substance. It has been shown that the obtained scaffolds are partially porous, electrically conductive, and able to release gallic acid over the long term.
Keywords: scaffold, foaming, supercritical, PVA, polypyrrole, gallic acid
Procedia PDF Downloads 182
14672 Implementation of a Lattice Boltzmann Method for Pulsatile Flow with Moment Based Boundary Condition
Authors: Zainab A. Bu Sinnah, David I. Graham
Abstract:
The Lattice Boltzmann Method has been developed and used to simulate both steady and unsteady fluid flow problems such as turbulent flows, multiphase flow and flows in the vascular system. As an example, the study of blood flow and its properties can give a greater understanding of atherosclerosis and the flow parameters which influence this phenomenon. The blood flow in the vascular system is driven by a pulsating pressure gradient which is produced by the heart. As a very simple model of this, we simulate plane channel flow under periodic forcing. This pulsatile flow is essentially the standard Poiseuille flow except that the flow is driven by the periodic forcing term. Moment boundary conditions, where various moments of the particle distribution function are specified, are applied at solid walls. We used a second-order single relaxation time model and investigated grid convergence using two distinct approaches. In the first approach, we fixed both Reynolds and Womersley numbers and varied relaxation time with grid size. In the second approach, we fixed the Womersley number and relaxation time. The expected second-order convergence was obtained for the second approach. For the first approach, however, the numerical method converged, but not necessarily to the appropriate analytical result. An explanation is given for these observations.
Keywords: Lattice Boltzmann method, single relaxation time, pulsatile flow, moment based boundary condition
Procedia PDF Downloads 231
14671 A Socio-Technical Approach to Cyber-Risk Assessment
Authors: Kitty Kioskli, Nineta Polemi
Abstract:
Evaluating the level of cyber-security risk within an enterprise is essential to protecting its information system, services, and all its digital assets against security incidents (e.g., accidents, malicious acts, massive cyber-attacks). Existing risk assessment methodologies (e.g., EBIOS, OCTAVE, CRAMM, NIST-800) adopt a technical approach, considering as attack factors only the capability, intention, and target of the attacker, while paying no attention to the attacker's psychological profile and personality traits. In this paper, a socio-technical approach to cyber risk assessment is proposed in order to achieve more realistic risk estimates by considering the personality traits of attackers. In particular, based upon principles from investigative psychology and behavioural science, a multi-dimensional, extended, quantifiable model of an attacker's profile is developed, which becomes an additional factor in the cyber risk level calculation.
Keywords: attacker, behavioural models, cyber risk assessment, cybersecurity, human factors, investigative psychology, ISO27001, ISO27005
Procedia PDF Downloads 165
14670 Management of Femoral Neck Stress Fractures at a Specialist Centre and Predictive Factors to Return to Activity Time: An Audit
Authors: Charlotte K. Lee, Henrique R. N. Aguiar, Ralph Smith, James Baldock, Sam Botchey
Abstract:
Background: Femoral neck stress fractures (FNSF) are uncommon, making up 1 to 7.2% of stress fractures in healthy subjects. FNSFs are prevalent in young women, military recruits, endurance athletes, and individuals with energy deficiency syndrome or female athlete triad. Presentation is often non-specific, and the condition is frequently misdiagnosed following the initial examination. There is limited research addressing the return-to-activity time after FNSF. Previous studies have demonstrated prognostic time predictions based on various imaging techniques. Here, (1) OxSport clinic FNSF practice standards are retrospectively reviewed, (2) FNSF cohort demographics are examined, and (3) regression models are used to predict return-to-activity prognosis and consequently determine bone stress risk factors. Methods: Patients with a diagnosis of FNSF attending the OxSport clinic between 01/06/2020 and 01/01/2020 were selected from the Rheumatology Assessment Database Innovation in Oxford (RhADiOn) and the OxSport Stress Fracture Database (n = 14). (1) Clinical practice was audited against five criteria based on local and National Institute for Health and Care Excellence guidance, with a 100% standard. (2) Demographics of the FNSF cohort were examined with Student's t-test. (3) Lastly, linear regression and random forest regression models were used on this patient cohort to predict return-to-activity time, followed by an analysis of feature importance for each fitted model. Results: OxSport clinical practice met the standard (100%) in 3/5 criteria. The criteria not met were patient waiting times and documentation of all bone stress risk factors. Importantly, analysis of patient demographics showed that of the population with complete bone stress risk factor assessments, 53% were positive for modifiable bone stress risk factors. Lastly, linear regression analysis was utilized to identify demographic factors that predicted return-to-activity time [R² = 79.172%; average error 0.226].
This analysis identified four key variables that predicted return-to-activity time: vitamin D level, total hip DEXA T value, femoral neck DEXA T value, and history of an eating disorder/disordered eating. Furthermore, random forest regression models were employed for this task [R² = 97.805%; average error 0.024]. Analysis of feature importance again identified a set of four variables, three of which matched the linear regression analysis (vitamin D level, total hip DEXA T value, and femoral neck DEXA T value), plus a fourth: age. Conclusion: OxSport clinical practice could be improved by more comprehensively evaluating bone stress risk factors. The importance of this evaluation is demonstrated by the proportion of the population found positive for these risk factors. Using this cohort, potential bone stress risk factors that significantly impacted return-to-activity prognosis were identified using regression models.
Keywords: eating disorder, bone stress risk factor, femoral neck stress fracture, vitamin D
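The linear regression step described above amounts to fitting a small set of predictors against return-to-activity time and reading off their weights. The following is a minimal sketch of that idea using ordinary least squares in NumPy; the data and coefficients are entirely invented stand-ins for the study's four predictors (vitamin D level, total hip DEXA T value, femoral neck DEXA T value, eating-disorder history), not the study's results:

```python
import numpy as np

# Synthetic stand-ins for the four predictors the audit identified.
# All values and effect sizes below are invented for illustration only.
rng = np.random.default_rng(0)
n = 100
X = rng.normal(size=(n, 4))
true_w = np.array([-0.8, -0.5, -0.6, 1.2])       # assumed effect sizes
y = X @ true_w + rng.normal(scale=0.1, size=n)   # "return-to-activity time"

# Ordinary least squares via NumPy's least-squares solver (with intercept)
Xb = np.column_stack([np.ones(n), X])
w, *_ = np.linalg.lstsq(Xb, y, rcond=None)
print(np.round(w[1:], 2))   # recovered weights, close to true_w
```

With enough observations relative to noise, the fitted weights recover the assumed effect sizes, which is the sense in which such a model "identifies key variables"; a random forest's feature importances rank predictors by a different, non-linear criterion, which is why the two methods can disagree on one variable as reported above.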
Procedia PDF Downloads 183
14669 Promoting Civic Health through Patient Voter Registration
Authors: Amit Syal, Madeline Grade, Alister Martin
Abstract:
Background: Cross-sectional and longitudinal studies demonstrate an association between health and voting. Furthermore, voting enables populations to support policies that impact their health via social determinants like income, education, housing, and healthcare access. Unfortunately, many barriers exist which disproportionately affect the civic participation of certain minority groups. Health professionals have an important role to play in addressing the civic health of all patients and empowering underrepresented communities. Description: Vot-ER is a non-partisan, nonprofit organization that aims to reduce barriers to civic participation by helping patients register to vote while in healthcare settings. The initial approach involved iPad-based kiosks in the emergency department waiting rooms, allowing patients to register themselves while waiting. After the COVID-19 pandemic began, Vot-ER expanded its touchless digital approaches. Vot-ER provides healthcare workers across the country with “Healthy Democracy Kits” consisting of badge backers, posters, discharge paperwork, and other resources. These contain QR and text codes that direct users to an online platform for registering to vote or requesting a mail-in ballot, available in English or Spanish. Outcomes: From May to November 2020, Vot-ER helped prepare 46,320 people to vote. 13,192 individual healthcare providers across all 50 states signed up for and received Healthy Democracy Kits. 80 medical schools participated in the Healthy Democracy Campaign competition. Over 500 institutions ordered site-based materials. Conclusions: A healthy democracy is one in which all individuals in a community have equal and fair opportunities for their voices to be heard. Healthcare settings, such as hospitals, are appropriate and effective venues for increasing both voter registration and education.
Keywords: civic health, enfranchisement, physician, voting
Procedia PDF Downloads 182
14668 A Non-parametric Clustering Approach for Multivariate Geostatistical Data
Authors: Francky Fouedjio
Abstract:
Multivariate geostatistical data have become omnipresent in the geosciences and pose substantial analysis challenges. One of them is the grouping of data locations into spatially contiguous clusters so that data locations within the same cluster are more similar to one another than to those in other clusters. Spatial contiguity significantly improves interpretability, turning the resulting clusters into meaningful geographical subregions. In this paper, we develop an agglomerative hierarchical clustering approach that takes into account the spatial dependency between observations. It relies on a dissimilarity matrix built from a non-parametric kernel estimator of the spatial dependence structure of the data. It integrates existing methods to find the optimal cluster number and to evaluate the contribution of variables to the clustering. The capability of the proposed approach to provide spatially compact, connected and meaningful clusters is assessed using a bivariate synthetic dataset and a multivariate geochemical dataset. The proposed clustering method gives satisfactory results compared to other similar geostatistical clustering methods.
Keywords: clustering, geostatistics, multivariate data, non-parametric
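The core of the approach above, a dissimilarity matrix that respects spatial dependency, fed to agglomerative merging, can be sketched in plain Python. The blending of spatial and attribute distance below (via `alpha`) is a crude illustrative stand-in for the paper's kernel estimator of the spatial dependence structure, and the data are invented:

```python
import math

def combined_dissimilarity(coords, values, alpha=0.5):
    """Blend spatial distance and attribute distance into one matrix
    (a crude stand-in for a kernel-based spatial-dependence estimator)."""
    n = len(coords)
    d = [[0.0] * n for _ in range(n)]
    for i in range(n):
        for j in range(i + 1, n):
            ds = math.dist(coords[i], coords[j])    # spatial distance
            da = abs(values[i] - values[j])         # attribute distance
            d[i][j] = d[j][i] = alpha * ds + (1 - alpha) * da
    return d

def single_linkage(d, n_clusters):
    """Naive agglomerative clustering: repeatedly merge the closest pair."""
    clusters = [{i} for i in range(len(d))]
    while len(clusters) > n_clusters:
        best = None
        for a in range(len(clusters)):
            for b in range(a + 1, len(clusters)):
                m = min(d[i][j] for i in clusters[a] for j in clusters[b])
                if best is None or m < best[0]:
                    best = (m, a, b)
        _, a, b = best
        clusters[a] |= clusters.pop(b)
    return clusters

# Two spatially separated groups with distinct attribute levels
coords = [(0, 0), (0, 1), (5, 5), (5, 6)]
values = [1.0, 1.1, 3.0, 3.2]
clusters = single_linkage(combined_dissimilarity(coords, values), 2)
print([sorted(c) for c in clusters])  # -> [[0, 1], [2, 3]]
```

Because spatial distance enters the dissimilarity, nearby locations are pulled into the same cluster, which is what yields the spatially compact, connected subregions the abstract describes.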
Procedia PDF Downloads 477
14667 The Significance of a Well-Defined Systematic Approach in Risk Management for Construction Projects within Oil Industry
Authors: Batool Ismaeel, Umair Farooq, Saad Mushtaq
Abstract:
Construction projects in the oil industry can be very complex, with unknown outcomes and uncertainties that cannot be easily predicted. Each project has its unique risks, generated by a number of factors which, if not controlled, will impact the successful completion of the project, mainly in terms of schedule, cost, quality, and safety. This paper highlights the historic risks associated with projects in the south and east region of Kuwait Oil Company (KOC), collated from the company's lessons-learned database. Starting from contract award through to handover of the project to the asset owner, gaps in project execution in terms of managing risk are brought to discussion, including the many claims, changes of scope, budget overruns, and delays in the engineering phase as well as in the procurement and fabrication of long-lead items that a well-defined, systematic approach to project risk management should address. This study focuses on a proposed feasible risk management approach for engineering, procurement and construction (EPC) level projects, including the various stakeholders involved in executing the works, from international to local contractors and vendors in KOC. The proposed approach covers areas categorized into organizational, design, procurement, construction, pre-commissioning, commissioning and project management, in which risks are identified and require management and mitigation. With effective deployment and implementation of the proposed risk management system, considered a vital key to achieving the project's targets, outcomes will be more predictable in the future, and risk triggers will be managed and controlled. The correct resources can then be allocated on a timely basis, enabling the company to avoid unpredictable outcomes during the execution of the project.
This paper recommends applying this risk management approach as an integral part of project management and, in the future, investigating the effectiveness of the proposed system on newly awarded projects and comparing the results with projects of similar budget and complexity that have not applied this approach to risk management.
Keywords: construction, project completion, risk management, uncertainties
Procedia PDF Downloads 153
14666 Supporting Women's Economic Development in Rural Papua New Guinea
Authors: Katja Mikhailovich, Barbara Pamphilon
Abstract:
Farmer training in Papua New Guinea has focused mainly on technology transfer approaches. This has primarily benefited men and often excluded women, whose low literacy, limited education and role in subsistence cropping have precluded participation in formal training. The paper discusses an approach that uses both a brokerage model of agricultural extension to link smallholders with private sector agencies and an innovative family teams approach that aims to support the economic empowerment of women in families and encourages sustainable and gender-equitable farming and business practices.
Keywords: women, economic development, agriculture, training
Procedia PDF Downloads 391
14665 Lung Cancer Detection and Multi Level Classification Using Discrete Wavelet Transform Approach
Authors: V. Veeraprathap, G. S. Harish, G. Narendra Kumar
Abstract:
Uncontrolled growth of abnormal cells in the lung, in the form of a tumor, can be either benign (non-cancerous) or malignant (cancerous). Patients with Lung Cancer (LC) have an average life expectancy of five years; early diagnosis, detection, and prediction reduce the need for risky invasive surgery among the treatment options and increase the survival rate. Computed Tomography (CT), Positron Emission Tomography (PET), and Magnetic Resonance Imaging (MRI) are commonly used for earlier detection of cancer. A Gaussian filter along with a median filter is used for smoothing and noise removal, and Histogram Equalization (HE) is used for image enhancement. The lung cavities are extracted, the background other than the two lung cavities is completely removed, and the right and left lungs are segmented separately. Region properties (area, perimeter, diameter, centroid, and eccentricity) are measured for the segmented tumor image, while texture is characterized by Gray-Level Co-occurrence Matrix (GLCM) functions; feature extraction provides the Region of Interest (ROI) given as input to the classifiers. Two levels of classification are employed: K-Nearest Neighbor (KNN) determines whether the patient's condition is normal or abnormal, while an Artificial Neural Network (ANN) identifies the cancer stage. The Discrete Wavelet Transform (DWT) algorithm is used for the main feature extraction, leading to the best efficiency. The developed technique shows encouraging results for real-time information and online detection, for future research.
Keywords: artificial neural networks, ANN, discrete wavelet transform, DWT, gray-level co-occurrence matrix, GLCM, k-nearest neighbor, KNN, region of interest, ROI
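The first classification level described above (normal vs. abnormal via KNN) reduces to a majority vote among the k nearest training samples in feature space. A minimal sketch follows, using invented two-dimensional stand-ins for the extracted features (the actual system uses the full DWT/GLCM and region-property feature vectors):

```python
import math
from collections import Counter

def knn_predict(train, labels, x, k=3):
    """Classify x by majority vote among its k nearest training points."""
    nearest = sorted(range(len(train)),
                     key=lambda i: math.dist(train[i], x))[:k]
    votes = Counter(labels[i] for i in nearest)
    return votes.most_common(1)[0][0]

# Toy stand-ins for two extracted features per patient (invented values)
train = [(0.1, 0.2), (0.2, 0.1), (0.9, 0.8), (0.8, 0.9)]
labels = ["normal", "normal", "abnormal", "abnormal"]
print(knn_predict(train, labels, (0.85, 0.85)))  # -> abnormal
```

In the two-level scheme, a sample flagged "abnormal" by such a vote would then be passed to the ANN for stage identification.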
Procedia PDF Downloads 153
14664 Factors Affecting the Fear of Insulin Injection and Finger Pricking in Individuals Diagnosed with Diabetes
Authors: Gaye Demi̇rtaş Adli
Abstract:
Research: This study was conducted to determine the factors affecting fear of self-injection and finger self-pricking in individuals with diabetes. It was designed as a cross-sectional, correlational, and descriptive study and was conducted on 122 patients who had just started insulin therapy. Data were obtained through the Descriptive Patient Form and the Diabetic Self-Injection and Fear of Testing Questionnaire (D-FISQ). Data were evaluated using descriptive statistical methods, the Mann-Whitney U test, the Kruskal-Wallis H test, and Spearman correlation. Factors affecting the scale scores were evaluated with multiple linear regression analysis. A value of p<0.05 was considered statistically significant. Study group: 56.6% of the patients were male. Women's fear of self-injection (injection), fear of self-testing (test), and total fear scores were statistically higher than men's (p<0.001). Age, gender, and pain experience were important variables affecting patients' fear of injections. With a one-unit increase in age, the injection fear score decreased by 0.13 points, and women's mean injection fear score was 2.11 points higher than men's. The patient's age, gender, living situation, and blood donation history were important variables affecting the fear of self-testing. With a one-unit increase in age, the testing fear score decreased by 0.18 points; women scored on average 3.358 points higher than men; those living alone scored 4.711 points higher than those living with family members; those who had not donated blood scored 2.572 points higher than those who had; and those with more pain experience scored on average 3.156 points higher on injection fear than those with less.
As a result, the most important factors affecting the fear of insulin injection and finger pricking in individuals with diabetes were seen to be age, gender, pain experience, living situation, and blood donation history.
Keywords: diabetes, needle phobia, fear of injection, insulin injection
Procedia PDF Downloads 71
14663 Multiscale Modeling of Damage in Textile Composites
Authors: Jaan-Willem Simon, Bertram Stier, Brett Bednarcyk, Evan Pineda, Stefanie Reese
Abstract:
Textile composites, in which the reinforcing fibers are woven or braided, have become very popular in numerous applications in the aerospace, automotive, and maritime industries. These textile composites are advantageous due to their ease of manufacture, damage tolerance, and relatively low cost. However, physics-based modeling of the mechanical behavior of textile composites is challenging. Compared to their unidirectional counterparts, textile composites introduce additional geometric complexities, which cause significant local stress and strain concentrations. Since these internal concentrations are primary drivers of nonlinearity, damage, and failure within textile composites, they must be taken into account in order for the models to be predictive. The macro-scale approach to modeling textile-reinforced composites treats the whole composite as an effective, homogenized material. This approach is very computationally efficient, but it cannot be considered predictive beyond the elastic regime because the complex microstructural geometry is not considered. Further, this approach can, at best, offer a phenomenological treatment of nonlinear deformation and failure. In contrast, the meso-scale approach to modeling textile composites explicitly considers the internal geometry of the reinforcing tows, and thus their interaction and the effects of their curved paths can be modeled. The tows are treated as effective (homogenized) materials, requiring the use of anisotropic material models to capture their behavior. Finally, the micro-scale approach goes one level lower, modeling the individual filaments that constitute the tows. This paper will compare meso- and micro-scale approaches to modeling the deformation, damage, and failure of textile-reinforced polymer matrix composites.
For the meso-scale approach, the woven composite architecture will be modeled using the finite element method, and an anisotropic damage model for the tows will be employed to capture the local nonlinear behavior. For the micro-scale approach, two different models will be used: one based on the finite element method, and the other making use of an embedded semi-analytical approach. The goal will be the comparison and evaluation of these approaches to modeling textile-reinforced composites in terms of accuracy, efficiency, and utility.
Keywords: multiscale modeling, continuum damage model, damage interaction, textile composites
Procedia PDF Downloads 354
14662 Neuroanatomical Specificity in Reporting & Diagnosing Neurolinguistic Disorders: A Functional & Ethical Primer
Authors: Ruairi J. McMillan
Abstract:
Introduction: This critical analysis aims to ascertain how well neuroanatomical aetiologies are communicated within 20 case reports of aphasia. Neuroanatomical visualisations based on dissected brain specimens were produced and combined with white matter tract and vascular taxonomies of function in order to address the most consistently underreported features found within the aphasic case study reports. Together, these approaches are intended to integrate aphasiological knowledge from the past 20 years with aphasiological diagnostics, and to act as prototypal resources for both researchers and clinical professionals. The medico-legal precedent for aphasia diagnostics under Canadian, US and UK case law, and the neuroimaging/neurological diagnostics relative to the functional capacity of aphasic patients, are discussed in relation to the major findings of the literary analysis, neuroimaging protocols in clinical use today, and the neuroanatomical aetiologies of different aphasias. Basic Methodology: Literature searches of relevant scientific databases (e.g., Ovid MEDLINE) were carried out using search terms such as aphasia case study (year) and stroke induced aphasia case study. A series of seven diagnostic reporting criteria were formulated, and the resulting case studies were scored out of 7 alongside clinical stroke criteria. In order to focus on the diagnostic assessment of the patient's condition, only the case report proper (not the discussion) was used to quantify results. Statistical testing established whether specific reporting criteria were associated with higher overall scores and potentially inferable increases in quality of reporting. Whether criteria scores were associated with an unclear/adjusted diagnosis was also tested, as was the probability of a given criterion deviating from an expected estimate.
Major Findings: The quantitative analysis of neuroanatomically driven diagnostics in case studies of aphasia revealed particularly low scores for the connection of neuroanatomical functions to aphasiological assessment (10%) and for the inclusion of white matter tracts within neuroimaging or assessment diagnostics (30%). Case studies which included clinical mention of white matter tracts within the report itself were distributed among higher-scoring cases, as were case studies which (as clinically indicated) related the affected vascular region to the brain parenchyma of the language network. Concluding Statement: These findings indicate that certain neuroanatomical functions are integrated within the patient report less often than others, despite a precedent for well-integrated neuroanatomical aphasiology also being found among the case studies sampled, and despite these functions being clinically essential in diagnostic neuroimaging and aphasiological assessment. Therefore, the integration and specificity of aetiological neuroanatomy may ultimately contribute positively to the capacity and autonomy of aphasic patients as well as their clinicians. The integration of a full aetiological neuroanatomy within the reporting of aphasias may improve patient outcomes and sustain autonomy in the event of medico-ethical investigation.
Keywords: aphasia, language network, functional neuroanatomy, aphasiological diagnostics, medico-legal ethics
Procedia PDF Downloads 67
14661 Information and Cooperativity in Fiction: The Pragmatics of David Baboulene’s “Knowledge Gaps”
Authors: Cara DiGirolamo
Abstract:
In his 2017 Ph.D. thesis, script doctor David Baboulene presented a theory of fiction in which differences in knowledge states between participants in a literary experience, including reader, author, and characters, create many story elements, among them suspense, expectations, subtext, theme, metaphor, and allegory. This theory can be adjusted and modeled by incorporating a formal pragmatic approach that understands narrative as a speech act with a conversational function. This approach requires both the Speaker and the Listener to be understood as participants in the discourse. It also uses theories of cooperativity and the Question Under Discussion (QUD) to identify the existence of implicit questions. This approach predicts what an effective literary narrative must do: provide a conversational context early in the story so the reader can engage with the text as a conversational participant. In addition, this model incorporates schema theory, a cognitive model for learning and processing information about the world and transforming it into functional knowledge. Using this approach can extend the QUD model: instead of describing conversation as a form of information gathering restricted to question-answer sets, the QUD can include knowledge modeling and understanding as a possible outcome of a conversation. With this model, Baboulene’s “Knowledge Gaps” can provide real insight into storytelling as a conversational move, and extend the QUD so that it applies simply and effectively to a more diverse set of conversational interactions and also to narrative texts.
Keywords: literature, speech acts, QUD, literary theory
Procedia PDF Downloads 2
14660 Assessment, Diagnosis and Treatment, Simulation for the Nurse Practitioner Student
Authors: Helen Coronel, Will Brewer, Peggy Bergeron, Clarissa Hall, Victoria Casson
Abstract:
Simulation-based training provides the nurse practitioner (NP) student with a safe and controlled environment in which they can practice a real-life scenario. This type of learning fosters critical thinking skills essential to practice. The expectation of this study was that students would show an increase in competency and confidence after performing the simulation. Approximately 8.4% of Americans suffer from depression. The state of Alabama is ranked 47th out of 50 for access to mental health care. As a result of this significant shortage of mental health providers, primary care providers are frequently put in the position of screening for and treating mental health conditions such as depression. Family nurse practitioners are often utilized as primary care providers, making their ability to assess, diagnose, and treat these disorders a necessary skill. The expected outcome of this simulation is an increase in the confidence, competency, and empowerment of nurse practitioner students in assessing, diagnosing, and treating a common mood disorder they may encounter in practice. The Kirkpatrick Model was applied in this study. A non-experimental design using descriptive statistical analysis was utilized. The simulation was based on a common psychiatric mood disorder frequently observed in primary care and mental health clinics. Students were asked to watch a voiced-over PowerPoint presentation prior to their on-campus simulation. The presentation included training on the assessment, diagnosis, and treatment of a patient with depression. Prior to the simulation, the students completed a pre-test; they then participated in the simulation and completed a post-test when done. Apple iPads were utilized to access a simulated health record.
Major findings of the study support an increase in students’ competency and confidence when assessing, diagnosing, and treating an adult patient with depression.
Keywords: advanced practice, nurse practitioner, simulation, primary care, depression
Procedia PDF Downloads 96
14659 Big Classes, Bigger Ambitions: A Participatory Approach to the Multiple-Choice Exam
Authors: Melanie Adrian, Elspeth McCulloch, Emily-Jean Gallant
Abstract:
Resources, whether financial, physical, or human, are increasingly constrained in higher education. University classes are getting bigger, and the concomitant grading burden on faculty is growing rapidly. Multiple-choice exams are seen by some as one solution to these changes. How much students retain, however, and what their testing experience is, continues to be debated. Are multiple-choice exams serving students well, or are they bearing the burden of these developments? Is there a way to address both the resource constraints and make these types of exams more meaningful? In short, how do we engender evaluation methods for large-scale classes that provide opportunities for heightened student learning and enrichment? The following article lays out a testing approach we have employed in four iterations of the same third-year law class. We base our comments in this paper on our initial observations as well as data gathered from an ethics-approved study looking at student experiences. This testing approach provides students with multiple opportunities for revision (thus increasing the chances of long-term retention), is both individually and collaboratively driven (thus reflecting individual effort and group effort), and is automatically graded (thus sparing limited institutional resources). We found that overall students appreciated the approach and found it more ‘humane’: it notably reduced pre-exam and intra-exam stress levels, increased ease, and lowered nervousness.
Keywords: exam, higher education, multiple-choice, law
Procedia PDF Downloads 128
14658 25 Years of the Neurolinguistic Approach: Origin, Outcomes, Expansion and Current Experiments
Authors: Steeve Mercier, Joan Netten, Olivier Massé
Abstract:
The traditional lack of success of most Canadian students in the regular French program in attaining the ability to communicate spontaneously led to the conceptualization of a modified program. This program, called Intensive French, introduced and evaluated as an experiment in several school districts, formed the basis for the creation of a more effective approach for the development of skills in a second/foreign language and literacy: the Neurolinguistic Approach (NLA). The NLA expresses the major change in the understanding of how communication skills are developed: learning to communicate spontaneously in a second language depends on the reuse of structures in a variety of cognitive situations to express authentic messages rather than on knowledge of the way a language functions. Put differently, it prioritises the acquisition of implicit competence over the learning of grammatical knowledge. This is achieved by the adoption of a literacy-based approach and an increase in the intensity of instruction. Besides having strong empirical support from numerous experiments, the NLA has a sound theoretical foundation, as it conforms to research in neurolinguistics. The five pedagogical principles that define the approach will be explained, as well as the differences between the NLA and the paradigm on which most current resources and teaching strategies are based. It is now 25 years since the original research occurred. The use of the NLA, as will be shown, has expanded widely. With some adaptations, it is used for other languages and in other milieus. In Canada, classes are offered in Mandarin, Ukrainian, Spanish, and Arabic, amongst others. It has also been used in several indigenous communities, for example in efforts to restore the use of Mohawk, Cree, and Dene. Its use has expanded throughout the world, to China, Japan, France, Germany, Belgium, Poland, Russia, and Mexico.
The Intensive French program originally focussed on students in grades 5 or 6 (ages 10-12); nowadays, the programs based on the approach include adults, particularly immigrants entering new countries. With the increasing interest in inclusion and cultural diversity, there is a demand for language learning amongst pre-school and primary children that can be successfully addressed by the NLA. Other current experiments target trilingual schools and work with Inuit communities of Nunavik in the province of Quebec.
Keywords: neuroeducation, neurolinguistic approach, literacy, second language acquisition, plurilingualism, foreign language teaching and learning
Procedia PDF Downloads 73
14657 Whether Chaos Theory Could Reconstruct the Ancient Societies
Authors: Zahra Kouzehgari
Abstract:
Since the early emergence of chaos theory in the 1970s in mathematics and physical science, it has increasingly been developed and adapted in the social sciences as well. The non-linear and dynamic characteristics of the theory make it a useful conceptual framework for interpreting the behavior of complex social systems. Drawing on the principles of the chaotic approach, namely sensitivity to initial conditions, dynamic adaptation, strange attractors, and unpredictability, this paper aims to examine whether the chaos approach could interpret ancient social changes. To do this, the paper first reviews a brief history of chaos theory, its development and application in the social sciences, and the principles underlying the theory, and then its application in archaeological science. The study demonstrates that although the existing archaeological record cannot reconstruct the whole social system of the human past, non-linear approaches to studying complex social systems would be of great help in finding the general order of ancient societies and would enable us to shed light on, or make sense of, some of the social phenomena in human history.
Keywords: archaeology, non-linear approach, chaos theory, ancient social systems
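The sensitivity to initial conditions invoked above can be made concrete with the logistic map, the standard textbook example of a chaotic system. The sketch below is our illustration of the principle, not material from the paper.

```python
def logistic_map(x0, r=4.0, steps=60):
    """Iterate x_{n+1} = r * x_n * (1 - x_n), a minimal chaotic system."""
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1.0 - xs[-1]))
    return xs

# Two trajectories whose starting points differ by only one part in a million.
a = logistic_map(0.200000)
b = logistic_map(0.200001)
divergence = [abs(u - v) for u, v in zip(a, b)]
# The gap grows roughly exponentially and reaches order 1 within ~30 steps,
# which is why long-range prediction fails even for a fully deterministic rule.
```

The same qualitative behaviour, tiny differences in starting conditions producing divergent histories, is what limits point prediction of ancient social trajectories while still permitting statements about general order.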
Procedia PDF Downloads 283
14656 Physics Informed Deep Residual Networks Based Type-A Aortic Dissection Prediction
Abstract:
Purpose: Acute Type A aortic dissection is well known for its extremely high mortality rate. A highly accurate and cost-effective non-invasive predictor is critically needed so that patients can be treated at an earlier stage. Although various CFD approaches have been tried to establish prediction frameworks, they are sensitive to uncertainty in both image segmentation and boundary conditions. Requirements for tedious pre-processing and demanding calibration procedures further compound the issue, hampering their clinical applicability. Using the latest physics-informed deep learning methods to establish an accurate and cost-effective predictor framework is among the main goals for better Type A aortic dissection treatment. Methods: By training a novel physics-informed deep residual network with non-invasive 4D MRI displacement vectors as inputs, the trained model can cost-effectively calculate all of these biomarkers: aortic blood pressure, wall shear stress (WSS), and oscillatory shear index (OSI), which are used to predict potential Type A aortic dissection and so avoid high-mortality events down the road. Results: The proposed deep learning method has been successfully trained and tested with both a synthetic 3D aneurysm dataset and a clinical dataset in the aortic dissection context using the Google Colab environment. In both cases, the model has generated aortic blood pressure, WSS, and OSI results matching the expected patient health status. Conclusion: The proposed novel physics-informed deep residual network shows great potential to create a cost-effective, non-invasive predictor framework. An additional physics-based de-noising algorithm will be added to make the model more robust to clinical data noise.
Further studies will be conducted in collaboration with large institutions such as the Cleveland Clinic, with more clinical samples, to further improve the model’s clinical applicability.
Keywords: type-a aortic dissection, deep residual networks, blood flow modeling, data-driven modeling, non-invasive diagnostics, deep learning, artificial intelligence
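Two ingredients of the approach described above can be sketched in a few lines: the composite "physics-informed" training objective (data misfit plus a PDE-residual penalty) and the OSI biomarker. The weighting, the residual values, and the function names below are our assumptions for illustration, not the authors' implementation.

```python
import numpy as np

def physics_informed_loss(pred, target, pde_residual, lam=1.0):
    """Composite loss: data misfit plus a weighted PDE-residual penalty,
    the defining trait of physics-informed training."""
    data_term = np.mean((pred - target) ** 2)
    physics_term = np.mean(pde_residual ** 2)
    return data_term + lam * physics_term

def oscillatory_shear_index(tau):
    """OSI = 0.5 * (1 - |mean(tau)| / mean(|tau|)) for a wall-shear-stress
    time series tau: 0 means unidirectional flow, 0.5 fully oscillatory."""
    return 0.5 * (1.0 - abs(tau.mean()) / np.abs(tau).mean())

loss = physics_informed_loss(np.array([1.0, 2.0]), np.array([1.0, 2.5]),
                             np.array([0.1, -0.1]), lam=10.0)  # 0.125 + 0.1
osi = oscillatory_shear_index(np.array([1.0, -1.0]))           # fully oscillatory
```

In a full model the residual term would come from automatically differentiating the network outputs against the governing flow equations; the point of the sketch is only the structure of the objective.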
Procedia PDF Downloads 89
14655 Taguchi-Based Six Sigma Approach to Optimize Surface Roughness for Milling Processes
Authors: Sky Chou, Joseph C. Chen
Abstract:
This paper focuses on using Six Sigma methodologies to improve the surface roughness of a part produced on a CNC milling machine. It presents a case study in which the surface roughness of milled aluminum must be improved to reduce or eliminate defects and to raise the process capability indices Cp and Cpk for a CNC milling process. The Six Sigma DMAIC (define, measure, analyze, improve, and control) approach was applied in this study to improve the process, reduce defects, and ultimately reduce costs. The Taguchi-based Six Sigma approach was applied to identify the optimized processing parameters that led to the targeted surface roughness specified by our customer. An L9 orthogonal array was applied in the Taguchi experimental design, with controllable factors and one non-controllable/noise factor. The controllable factors identified consist of feed rate, depth of cut, and spindle speed, with surface roughness as the measured response. The noise factor is the difference between the old cutting tool and the new cutting tool. The confirmation run with the optimal parameters confirmed that the new parameter settings are correct. The new settings also improved the process capability index. This study shows that the Taguchi-based Six Sigma approach can be used efficiently to phase out defects and improve the process capability index of the CNC milling process.
Keywords: CNC machining, six sigma, surface roughness, Taguchi methodology
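The capability indices Cp and Cpk tracked in the study, and the smaller-the-better signal-to-noise ratio conventionally used in a Taguchi analysis of surface roughness, can be computed directly. The sample readings and specification limits below are illustrative stand-ins, not the paper's data.

```python
import math
import statistics

def cp_cpk(samples, lsl, usl):
    """Process capability indices for specification limits [lsl, usl]."""
    mu = statistics.mean(samples)
    sigma = statistics.stdev(samples)            # sample standard deviation
    cp = (usl - lsl) / (6 * sigma)               # potential capability
    cpk = min(usl - mu, mu - lsl) / (3 * sigma)  # capability, centering-aware
    return cp, cpk

def sn_smaller_better(ys):
    """Taguchi smaller-the-better S/N ratio, appropriate for roughness."""
    return -10.0 * math.log10(sum(y * y for y in ys) / len(ys))

roughness = [9.0, 10.0, 11.0, 10.0, 10.0]   # illustrative Ra readings
cp, cpk = cp_cpk(roughness, lsl=7.0, usl=13.0)
sn = sn_smaller_better([1.0, 1.0])          # 0 dB for a unit response
```

In a DMAIC cycle these numbers are recomputed after the confirmation run to verify that the optimized factor settings actually lifted Cp and Cpk.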
Procedia PDF Downloads 242
14654 Optimization of the Numerical Fracture Mechanics
Authors: H. Hentati, R. Abdelmoula, Li Jia, A. Maalej
Abstract:
In this work, we present numerical simulations of quasi-static crack propagation based on the variational approach. We perform numerical simulations of a piece of brittle material without an initial crack. An alternate minimization algorithm is used. Based on these numerical results, we determine the influence of numerical parameters on the location of the crack. We show the importance of trying to optimize the time of numerical computation, and we present a first attempt to develop a simple numerical method to optimize this time.
Keywords: fracture mechanics, optimization, variational approach, mechanics
Procedia PDF Downloads 606
14653 Evaluation of Model-Based Code Generation for Embedded Systems–Mature Approach for Development in Evolution
Authors: Nikolay P. Brayanov, Anna V. Stoynova
Abstract:
The model-based development approach is gaining more support and acceptance. Its higher abstraction level brings a simplification of system description that allows domain experts to do their best without particular knowledge of programming. The different levels of simulation support rapid prototyping, verifying, and validating the product even before it exists physically. Nowadays, the model-based approach is beneficial for modelling complex embedded systems as well as for generating code for many different hardware platforms. Moreover, it can be applied in safety-relevant industries like automotive, which brings extra automation to the expensive device certification process and especially to software qualification. Using it, some companies report cost savings and quality improvements, but others claim no major changes or even cost increases. This publication demonstrates the level of maturity and autonomy of the model-based approach for code generation. It is based on a real-life automotive seat heater (ASH) module, developed using The MathWorks, Inc. tools. The model, created with Simulink, Stateflow, and Matlab, is used for automatic generation of C code with Embedded Coder. To prove the maturity of the process, the Code Generation Advisor is used for automatic configuration. All additional configuration parameters are set to auto, when applicable, leaving the generation process to function autonomously. As a result of the investigation, the publication compares the quality of the automatically generated embedded code with a manually developed one. The measurements show that, in general, the code generated by the automatic approach is no worse than the manual one. A deeper analysis of the technical parameters enumerates the disadvantages, some of which are identified as topics for our future work.
Keywords: embedded code generation, embedded C code quality, embedded systems, model-based development
Procedia PDF Downloads 244
14652 Management of First Trimester Miscarriage
Authors: Madeleine Cox
Abstract:
Objective: analyse patient choices in the management of first trimester miscarriage and rates of complications, including repeat procedures. Design: all first trimester miscarriages at a tertiary institution on the Gold Coast over a 6-month time frame (July to December 2021) were reviewed, including choice of management, histopathology, any representations or admissions, and potential complications. Results: a total of 224 first trimester miscarriages were identified. Of these, 183 (81%) opted to have surgical management in the first instance. Of the remaining patients, 18 (8%) opted for medical management, and 28 (12.5%) opted for expectant management. In total, 33 (15%) patients required a repeat treatment for retained products. One had medical management for a small volume of retained products of conception (RPOC) after suction curettage. A significant number of these patients initially opted for medical management but then elected to have shorter follow-up than usual and went on to have retained products noted. Five women who had small volumes of RPOC after medical or surgical management had a repeat suction curettage; however, they had very small volumes of products on scan and at curettage and may have had a good result with repeated misoprostol administration. It is important to note that, whilst a common procedure, suction curettage is not without risk. Two women had significant blood loss of 1 L and 1.5 L. A third woman had a uterine perforation, a rare but recognised complication; she went on to require a laparoscopy, which identified a small serosal bowel injury that was closed by the colorectal team. Conclusion: Management of first trimester miscarriage should be guided by patient preference. It is important to be able to provide patients with their choice of management; however, it is also important to have a good understanding of the risks of each management choice, the chances of a repeated procedure, and the appropriate time frame for follow-up.
Women who choose to undertake medical or expectant management should be supported through this time, with an appropriate time frame between taking misoprostol and the repeat scan so that the true effects can be evaluated. Patients returning for scans within 2-3 days are more likely to be booked for further surgery; however, this may reflect patients who did not have adequate counselling or who simply changed their mind about their preferred management option.
Keywords: miscarriage, gynaecology, obstetrics, first trimester
Procedia PDF Downloads 101
14651 Clinical Trial of VEUPLEXᵀᴹ TBI Assay to Help Diagnose Traumatic Brain Injury by Quantifying Glial Fibrillary Acidic Protein and Ubiquitin Carboxy-Terminal Hydrolase L1 in the Serum of Patients Suspected of Mild TBI by Fluorescence Immunoassay
Authors: Moon Jung Kim, Guil Rhim
Abstract:
The clinical sensitivity of the “VEUPLEXᵀᴹ TBI assay”, a clinical trial medical device, in mild traumatic brain injury was 28.6% (95% CI, 19.7%-37.5%), and the clinical specificity was 94.0% (95% CI, 89.3%-98.7%). In addition, when the results were analyzed by marker, sensitivity was higher when the two tests, UCHL1 and GFAP, were interpreted together than when either test was interpreted alone. Additionally, when sensitivity and specificity were analyzed against CT results for the mild traumatic brain injury patient group, the clinical sensitivity for the 2 CT-positive cases was 50.0% (95% CI: 1.3%-98.7%), and the clinical specificity for the 19 CT-negative cases was 68.4% (95% CI: 43.5%-87.4%). Since the low clinical sensitivity for the two CT-positive cases was not statistically significant due to the small number of samples analyzed, it was judged necessary to secure and analyze more samples in the future. Regarding the clinical specificity results for the 19 CT-negative cases, a large number of patients clinically diagnosed with mild traumatic brain injury nevertheless received a CT-negative result, and about 31.6% of them showed abnormal results on the VEUPLEXᵀᴹ TBI assay. Although traumatic brain injury could not be detected on CT in these 31.6% of cases, the possibility of an actual mild brain injury could not be ruled out, so it was judged that this could be confirmed through follow-up observation of the patients. In addition, among patients with mild traumatic brain injury, CT examinations were often not performed because the symptoms were very mild; among these patients, about 25% or more showed abnormal results on the VEUPLEXᵀᴹ TBI assay. In fact, no damage is observed with the naked eye immediately after traumatic brain injury, and traumatic brain injury is not observed even on CT.
In some cases, however, brain hemorrhage may occur after a certain period of time (delayed cerebral hemorrhage), so patients who show abnormal results on the VEUPLEXᵀᴹ TBI assay should be followed up for delayed cerebral hemorrhage. In conclusion, it was judged difficult to diagnose mild traumatic brain injury with the VEUPLEXᵀᴹ TBI assay through clinical findings alone, that is, based on the GCS value, without CT results. Even CT does not detect all mild traumatic brain injuries, so the absence of evidence of traumatic brain injury on CT does not necessarily mean there is no traumatic brain injury. In the long term, more patients should be included to evaluate the usefulness of the VEUPLEXᵀᴹ TBI assay in the detection of microscopic traumatic brain injuries without using CT.
Keywords: brain injury, traumatic brain injury, GFAP, UCHL1
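The sensitivities, specificities, and 95% confidence intervals quoted above are standard binomial proportions. A sketch of the normal-approximation (Wald) interval follows; the count 28/98 is our illustrative guess chosen to reproduce the reported 28.6% sensitivity, not a figure taken from the study.

```python
import math

def wald_ci(successes, n, z=1.96):
    """Point estimate and normal-approximation 95% CI for a proportion."""
    p = successes / n
    half = z * math.sqrt(p * (1 - p) / n)
    return p, max(0.0, p - half), min(1.0, p + half)

p, lo, hi = wald_ci(28, 98)   # roughly 0.286 with CI near (0.196, 0.375)
```

For very small counts such as the 2 CT-positive cases, the Wald interval is unreliable (hence the huge 1.3%-98.7% range reported), and an exact or Wilson interval would normally be preferred.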
Procedia PDF Downloads 99
14650 Characteristic Function in Estimation of Probability Distribution Moments
Authors: Vladimir S. Timofeev
Abstract:
In this article, the problem of estimating distributional moments is considered. A new approach to moment estimation, based on the characteristic function, is proposed. Using a statistical simulation technique, the author shows that the new approach has robust properties. The derivatives of the characteristic function are calculated by numerical differentiation. The results obtained confirm that the idea works efficiently and can be recommended for statistical applications.
Keywords: characteristic function, distributional moments, robustness, outlier, statistical estimation problem, statistical simulation
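The core idea, raw moments recovered from numerical derivatives of the empirical characteristic function at zero, can be sketched in a few lines. The step size and the central-difference scheme below are our choices for illustration, not necessarily the author's.

```python
import cmath

def ecf(data, t):
    """Empirical characteristic function: (1/n) * sum_j exp(i * t * x_j)."""
    return sum(cmath.exp(1j * t * x) for x in data) / len(data)

def moments_via_ecf(data, h=1e-3):
    """First two raw moments from central differences of the ECF at t = 0,
    using phi'(0) = i * m1 and phi''(0) = -m2."""
    d1 = (ecf(data, h) - ecf(data, -h)) / (2 * h)
    d2 = (ecf(data, h) - 2 * ecf(data, 0) + ecf(data, -h)) / h ** 2
    return d1.imag, -d2.real

m1, m2 = moments_via_ecf([1.0, 2.0, 3.0])   # exact values are 2 and 14/3
```

Because the ECF averages bounded complex exponentials rather than raw powers of the data, single extreme observations perturb it far less than they perturb sample moments, which is the intuition behind the robustness claim.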
Procedia PDF Downloads 504
14649 The Impact of Female Education on Fertility: A Natural Experiment from Egypt
Authors: Fatma Romeh, Shiferaw Gurmu
Abstract:
This paper examines the impact of female education on fertility, using the change in length of primary schooling in Egypt in 1988-89 as the source of exogenous variation in schooling. In particular, beginning in 1988, children had to attend primary school for only five years rather than six years. This change was applicable to all individuals born on or after October 1977. Using a nonparametric regression discontinuity approach, we compare education and fertility of women born just before and after October 1977. The results show that female education significantly reduces the number of children born per woman and delays the time until first birth. Applying a robust regression discontinuity approach, however, the impact of education on the number of children is no longer significant. The impact on the timing of first birth remained significant under the robust approach. Each year of female education postponed childbearing by three months, on average.
Keywords: Egypt, female education, fertility, robust regression discontinuity
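The regression discontinuity logic can be sketched on simulated data: "treatment" switches at the birth-date cutoff, and the effect is the jump in the outcome at that cutoff, read off from separate linear fits on each side. Everything below, including the simulated jump of -0.8, is an illustrative construction and is not the Egyptian data.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 4000
x = rng.uniform(-1.0, 1.0, n)      # running variable: birth date minus cutoff
treated = x >= 0.0                 # e.g. born on or after October 1977
y = 2.0 + 0.5 * x - 0.8 * treated + rng.normal(0.0, 0.1, n)

# Fit separate linear trends on each side and read off the jump at the cutoff.
left = np.polyfit(x[~treated], y[~treated], 1)
right = np.polyfit(x[treated], y[treated], 1)
effect = np.polyval(right, 0.0) - np.polyval(left, 0.0)   # estimates -0.8
```

A robust RD estimator of the kind the abstract mentions additionally restricts the fit to a data-driven bandwidth around the cutoff and corrects the bias of the local polynomial, which is why its conclusions can differ from the simple global fit shown here.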
Procedia PDF Downloads 338
14648 A Case-Series Analysis of Tuberculosis in Patients at Internal Medicine Department
Authors: Cherif Y., Ghariani R., Derbal S., Farhati S., Ben Dahmen F., Abdallah M.
Abstract:
Introduction: Tuberculosis (TBC) is a frequent infection and is still a major public health problem in Tunisia. The aim of this work is to describe the diagnostic and therapeutic characteristics of TBC in patients referred to our internal medicine department. Patients and Methods: This was a retrospective, descriptive study of a cohort of consecutive cases treated from January 2016 to December 2019, including patients with latent or patent TBC. Twenty-eight medical records of adults diagnosed with TBC were reviewed. Results: Twenty-eight patients, including 18 women and 10 men, were diagnosed with TBC. Their mean age is 48 years (range: 22-78 years). Five patients had a medical history of diabetes mellitus, 1 patient was followed for systemic lupus erythematosus treated with corticosteroids and immunosuppressant drugs, and another was treated with corticosteroids for McDuffie syndrome. The TBC was latent in 12 cases and patent in 16 cases. The most common symptoms were fever and weight loss, found in 10 cases; there was a cough in 2 cases, sputum in 3 cases, lymph nodes in 4 cases, erythema nodosum in 2 cases, and neurological signs in 3 cases. Lymphopenia was noticed in 3 cases and a biological inflammatory syndrome in 18 of the cases. The purified protein derivative reaction was positive in 17 cases, anergic in 3 cases, negative in 5 cases, and not done in 3 cases. The acid-fast bacilli stain culture was strongly positive in one patient. The histopathological study was conclusive in 11 patients and showed granulomatosis with caseous necrosis. TBC was pulmonary in 7 patients, lymph node in 7 cases, peritoneal in 7 cases, digestive in 1 case, neuromeningeal in 3 cases, and thyroid in 1 case. Seven patients had multifocal TBC. All the patients received anti-tuberculosis treatment with a mean duration of 8 months, with no failure or relapse over an average follow-up time of 10.58 months.
Conclusion: Diagnosis and management of TBC remain essential to avoid serious complications. Surveillance is necessary to ensure timely detection and treatment of infected adults in order to decrease its incidence. The best treatment remains preventive, through vaccination and improving social and economic conditions.
Keywords: tuberculosis, infection, autoimmune disease, granulomatosis
Procedia PDF Downloads 185
14647 Synergistic Behavior of Polymer Mixtures in Designing Hydrogels for Biomedical Applications
Authors: Maria Bercea, Monica Diana Olteanu
Abstract:
Investigation of polymer systems able to change into networks inside the body represents an attractive approach, especially when administration is minimally invasive and patient-friendly. Pharmaceutical formulations based on Pluronic F127 [poly(oxyethylene) (PEO) blocks (70%) and poly(oxypropylene) (PPO) blocks (30%)] present excellent potential as drug delivery systems. The use of Pluronic F127 alone as a gel-forming solution is limited by some characteristics, such as poor mechanical properties, short residence time, and high permeability. Investigation of the interactions between natural and synthetic polymers and surfactants in solution is a subject of great interest from both the scientific and practical points of view. For example, formulations based on Pluronics and chitosan could be used to obtain dual phase transition hydrogels responsive to temperature and pH changes. In this study, different materials were prepared by using poly(vinyl alcohol) and chitosan solutions mixed with aqueous solutions of Pluronic F127. The rheological properties of different formulations were investigated in temperature sweep experiments as well as at a constant temperature of 37°C to explore in-situ gel formation under human body conditions. In addition, some viscometric investigations were carried out in order to understand the interactions that determine the complex behaviour of these systems. Correlation between the thermodynamic and rheological parameters and the phase separation phenomena observed for the investigated systems allowed the determination of the constitutive response of the polymeric materials to different external stimuli, such as temperature and pH.
The rheological investigation demonstrated that the viscoelastic moduli of the hydrogels can be tuned depending on concentration of different components as well as pH and temperature conditions and cumulative contributions can be obtained.
Keywords: hydrogel, polymer mixture, stimuli responsive, biomedical applications
Procedia PDF Downloads 349
14646 Delivering Safer Clinical Trials; Using Electronic Healthcare Records (EHR) to Monitor, Detect and Report Adverse Events in Clinical Trials
Authors: Claire Williams
Abstract:
Randomised controlled trials (RCTs) of efficacy are still perceived as the gold standard for the generation of evidence, and whilst advances in data collection methods are well developed, this progress has not been matched for the reporting of adverse events (AEs). Assessment and reporting of AEs in clinical trials are fraught with human error and inefficiency and are extremely time- and resource-intensive. Recent research into the quality of reporting of AEs during clinical trials concluded that it is substandard and that reporting is inconsistent. Investigators commonly send reports to sponsors that are incorrectly categorised and lacking in critical information, which can complicate the detection of valid safety signals. In our presentation, we will describe an electronic data capture system which has been designed to support clinical trial processes by reducing the resource burden on investigators, improving overall trial efficiencies, and making trials safer for patients. This proprietary technology was developed using expertise proven in the delivery of the world’s first prospective, phase 3b real-world trial, ‘The Salford Lung Study’, which enabled robust safety monitoring and reporting processes to be accomplished by the remote monitoring of patients’ EHRs. This technology enables safety alerts that are pre-defined by the protocol to be detected from the data extracted directly from the patient's EHR. Based on study-specific criteria, which are created from the standard definition of a serious adverse event (SAE) and the safety profile of the medicinal product, the system alerts the investigator or study team. Each safety alert requires a clinical review by the investigator or delegate; examples of the types of alerts include hospital admission, death, hepatotoxicity, neutropenia, and acute renal failure.
This is achieved in near real-time; safety alerts can be reviewed along with any additional information available to determine whether they meet the protocol-defined criteria for reporting or withdrawal. This active surveillance technology helps reduce the resource burden of the more traditional methods of AE detection for investigators and study teams and can help eliminate reporting bias. Integration of multiple healthcare data sources enables much more complete and accurate safety data to be collected as part of a trial and can also provide an opportunity to evaluate a drug’s safety profile long-term, in post-trial follow-up. By utilising this robust and proven method for safety monitoring and reporting, much higher-risk patient cohorts can be enrolled into trials, thus promoting inclusivity and diversity. Broadening eligibility criteria and adopting more inclusive recruitment practices in the later stages of drug development will increase the ability to understand the medicinal product's risk-benefit profile across the patient population that is likely to use the product in clinical practice. Furthermore, this ground-breaking approach to AE detection not only provides sponsors with better-quality safety data for their products but also reduces the resource burden on the investigator and study teams. With the data taken directly from the source, trial costs are reduced, minimal data validation is required, and near real-time reporting enables safety concerns and signals to be detected more quickly than in a traditional RCT.
Keywords: more comprehensive and accurate safety data, near real-time safety alerts, reduced resource burden, safer trials
Procedia PDF Downloads 84
14645 Fuzzy Logic Modeling of Evaluation the Urban Skylines by the Entropy Approach
Authors: Murat Oral, Seda Bostancı, Sadık Ata, Kevser Dincer
Abstract:
When evaluating the aesthetics of cities, an analysis of urban form development depending on design properties and a variety of factors is performed, together with a study of the effects of this appearance on human beings. Different methods are used when making an aesthetic evaluation of a city. Entropy, in its original meaning, is a mathematical representation of thermodynamic results. Measuring entropy is related to the distribution of the positional figures of a message or information from the standpoint of probabilities. In this study, the evaluation of urban skylines by the entropy approach was modelled with the Rule-Based Mamdani-Type Fuzzy (RBMTF) modelling technique. Input-output parameters were described by RBMTF if-then rules. Numerical parameters of the input and output variables were fuzzified as linguistic variables: Very Very Low (L1), Very Low (L2), Low (L3), Negative Medium (L4), Medium (L5), Positive Medium (L6), High (L7), Very High (L8), and Very Very High (L9) linguistic classes. The comparison between the application data and RBMTF is done by using the absolute fraction of variance (R²). The actual values and RBMTF results indicated that RBMTF can be successfully used for the analysis of the evaluation of urban skylines by the entropy approach. As a result, the RBMTF model has shown a satisfying relation with the experimental results, which suggests it as an alternative method for evaluating urban skylines by the entropy approach.
Keywords: urban skylines, entropy, rule-based Mamdani type, fuzzy logic
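A toy version of Mamdani-type fuzzy inference, triangular membership functions, rule firing by min, aggregation by max, and centroid defuzzification, can make the RBMTF machinery concrete. The two-rule system below is our reduced illustration, not the nine-class (L1-L9) model of the study.

```python
def tri(x, a, b, c):
    """Triangular membership function with peak at b on support [a, c]."""
    if x < a or x > c:
        return 0.0
    if x <= b:
        return 1.0 if b == a else (x - a) / (b - a)
    return 1.0 if c == b else (c - x) / (c - b)

def mamdani(x, rules, ys):
    """Mamdani inference: clip each rule's output set at its firing strength,
    aggregate with max, and defuzzify by the centroid over the grid ys."""
    agg = [max(min(in_mf(x), out_mf(y)) for in_mf, out_mf in rules) for y in ys]
    den = sum(agg)
    return sum(y * m for y, m in zip(ys, agg)) / den if den else 0.0

# Two illustrative rules on a 0-10 scale: "low -> low" and "high -> high".
low = lambda v: tri(v, 0.0, 0.0, 10.0)
high = lambda v: tri(v, 0.0, 10.0, 10.0)
rules = [(low, low), (high, high)]
ys = [float(y) for y in range(11)]
out = mamdani(10.0, rules, ys)   # centroid pulled toward the high end
```

A full RBMTF model would use nine such linguistic classes on each variable and many if-then rules, but the inference cycle per input is exactly the one sketched here.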
Procedia PDF Downloads 290