Search results for: Nicholas Weber
69 Vibro-Tactile Equalizer for Musical Energy-Valence Categorization
Authors: Dhanya Nair, Nicholas Mirchandani
Abstract:
Musical haptic systems can enhance a listener’s musical experience while providing an alternative platform for the hearing impaired to experience music. Current music tactile technologies focus on tactile metronomes that synchronize performers or on encoding musical notes into distinguishable (albeit distracting) tactile patterns. There is growing interest in the development of musical haptic systems to augment the auditory experience, although the haptic-music relationship is still not well understood. This paper presents a tactile music interface that provides vibrations to multiple fingertips in synchrony with auditory music. Like an audio equalizer, different frequency bands are filtered out, and the power in each frequency band is computed and converted to a corresponding vibrational strength. These vibrations are felt on different fingertips, each corresponding to a different frequency band. Songs from different parts of the spectrum, as classified by their energy and valence, were used to test the effectiveness of the system and to understand the relationship between music and tactile sensations. Three participants were trained on one song categorized as sad (low energy and low valence score) and one song categorized as happy (high energy and high valence score). They were trained both with and without auditory feedback (listening to the song while experiencing the tactile music on their fingertips and then experiencing the vibrations alone without the music). The participants were then tested on three songs from both categories, without any auditory feedback, and were asked to classify the tactile vibrations they felt into either category. The participants were blinded to the songs being tested and were not provided any feedback on the accuracy of their classification. These participants were able to classify the music with 100% accuracy.
Although the songs tested were at two opposite ends of the spectrum (sad/happy), the preliminary results show the potential of utilizing a vibrotactile equalizer, like the one presented, for augmenting the musical experience while furthering the current understanding of the music-tactile relationship.
Keywords: haptic music relationship, tactile equalizer, tactile music, vibrations and mood
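The equalizer-style processing described above — filtering the audio into frequency bands, computing the power in each band, and mapping it to a vibration strength for one fingertip — can be sketched as follows. This is an illustrative reconstruction under assumed band edges and an FFT-based power estimate, not the authors' implementation:

```python
import numpy as np

def band_vibration_strengths(signal, fs, bands):
    """Map the power in each frequency band to a relative vibration strength."""
    spectrum = np.abs(np.fft.rfft(signal)) ** 2        # power per frequency bin
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)   # bin centre frequencies
    powers = [spectrum[(freqs >= lo) & (freqs < hi)].sum() for lo, hi in bands]
    total = sum(powers) or 1.0
    return [p / total for p in powers]                 # one strength per fingertip

# Toy check: a pure 100 Hz tone should drive only the lowest band/fingertip.
fs = 8000
t = np.arange(fs) / fs
tone = np.sin(2 * np.pi * 100 * t)
bands = [(20, 250), (250, 1000), (1000, 4000)]         # assumed band edges
strengths = band_vibration_strengths(tone, fs, bands)
```

In a real-time system, each strength would be rescaled to the actuator's drive range and recomputed over short overlapping windows to stay in synchrony with the music.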
Procedia PDF Downloads 181
68 Hormone Replacement Therapy (HRT) and Its Impact on the All-Cause Mortality of UK Women: A Matched Cohort Study 1984-2017
Authors: Nurunnahar Akter, Elena Kulinskaya, Nicholas Steel, Ilyas Bakbergenuly
Abstract:
Although Hormone Replacement Therapy (HRT) is an effective treatment for ameliorating menopausal symptoms, it has mixed effects on different health outcomes, increasing, for instance, the risk of breast cancer. Because of this, many symptomatic women are left untreated. Untreated menopausal symptoms may result in other health issues, which eventually place an extra burden and costs on the health care system. All-cause mortality analysis may explain the net benefits and risks of HRT; however, it has received far less attention in HRT studies. This study investigated the impact of HRT on all-cause mortality using electronically recorded primary care data from The Health Improvement Network (THIN), which broadly represents the female population of the United Kingdom (UK). The study entry date was the record of the first HRT prescription from 1984, and patients were followed up until death, transfer to another GP practice, or the study end date of January 2017. 112,354 HRT users (cases) were matched with 245,320 non-users by age at HRT initiation and general practice (GP). The hazards of all-cause mortality associated with HRT were estimated by a parametric Weibull-Cox model adjusting for a wide range of important medical, lifestyle, and socio-demographic factors. Multilevel multiple imputation techniques were used to deal with missing data. This study found that during 32 years of follow-up, combined HRT reduced the hazard ratio (HR) of all-cause mortality by 9% (HR: 0.91; 95% Confidence Interval, 0.88-0.94) in women aged 46 to 65 at first treatment compared to non-users of the same age. Age-specific mortality analyses found that combined HRT decreased mortality by 13% (HR: 0.87; 95% CI, 0.82-0.92), 12% (HR: 0.88; 95% CI, 0.82-0.93), and 8% (HR: 0.92; 95% CI, 0.85-0.98) in the 51 to 55, 56 to 60, and 61 to 65 age groups at first treatment, respectively.
There was no association between estrogen-only HRT and women’s all-cause mortality. The findings from this study may help inform the choices of women at menopause and further educate clinicians and resource planners.
Keywords: hormone replacement therapy, multiple imputation, primary care data, the health improvement network (THIN)
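For readers unfamiliar with the model class used above: a parametric Weibull-Cox (proportional-hazards) model multiplies a Weibull baseline hazard by exp(beta·x), so the hazard ratio exp(beta) is constant over follow-up. A minimal sketch with made-up baseline parameters shows how a fitted coefficient reproduces a reported HR such as 0.91:

```python
import math

def weibull_hazard(t, lam, k):
    """Baseline Weibull hazard h0(t) = k * lam * t**(k - 1)."""
    return k * lam * t ** (k - 1)

def ph_hazard(t, lam, k, beta, x):
    """Proportional-hazards form: h(t | x) = h0(t) * exp(beta * x)."""
    return weibull_hazard(t, lam, k) * math.exp(beta * x)

# A log-hazard coefficient of ln(0.91) for HRT use (x = 1 vs x = 0) yields
# a 9% hazard reduction at any time t; lam and k here are illustrative only.
beta = math.log(0.91)
hr = ph_hazard(10.0, 0.01, 1.4, beta, 1) / ph_hazard(10.0, 0.01, 1.4, beta, 0)
```

In practice the coefficients, shape, and scale would be estimated jointly from the matched cohort data rather than assumed.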
Procedia PDF Downloads 170
67 Approaches to Reduce the Complexity of Mathematical Models for the Operational Optimization of Large-Scale Virtual Power Plants in Public Energy Supply
Authors: Thomas Weber, Nina Strobel, Thomas Kohne, Eberhard Abele
Abstract:
In the context of the energy transition in Germany, the importance of so-called virtual power plants in the energy supply continues to increase. The progressive dismantling of the large power plants and the ongoing construction of many new decentralized plants result in great potential for optimization through synergies between the individual plants. These potentials can be exploited by mathematical optimization algorithms to calculate the optimal operational planning of decentralized power and heat generators and storage systems. This includes linear and mixed-integer linear optimization. In this paper, procedures for reducing the number of decision variables to be calculated are explained and validated. On the one hand, this includes combining n similar installation types into one aggregated unit. This aggregated unit is described by the same constraints and objective function terms as a single plant. This reduces the number of decision variables per time step, and thus the complexity of the problem to be solved, by a factor of n. The exact operating mode of the individual plants can then be calculated in a second optimization such that the output of the individual plants corresponds to the calculated output of the aggregated unit. Another way to reduce the number of decision variables in an optimization problem is to reduce the number of time steps to be calculated. This is useful if a high temporal resolution is not necessary for all time steps; for example, the volatility or the forecast quality of environmental parameters may justify a high or low temporal resolution of the optimization. Both approaches are examined for the resulting calculation time as well as for optimality.
Several optimization models for virtual power plants (combined heat and power plants, heat storage, power storage, gas turbine) with different numbers of plants are used as a reference for the investigation of both approaches with regard to calculation duration and optimality.
Keywords: CHP, Energy 4.0, energy storage, MILP, optimization, virtual power plant
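The first reduction approach — treating n similar plants as one aggregated unit with scaled bounds, then splitting the aggregated result back over the individual units in a second stage — can be illustrated with a toy dispatch function. This is a deliberately simplified sketch (one time step, identical units, even disaggregation), not the authors' MILP formulation:

```python
def aggregate_dispatch(n_units, p_min, p_max, demand):
    """Dispatch one aggregated unit (bounds scaled by n), then split the
    result evenly back over the n identical units (second-stage step)."""
    agg_min, agg_max = n_units * p_min, n_units * p_max
    agg_output = min(max(demand, agg_min), agg_max)   # clip to feasible range
    return [agg_output / n_units] * n_units           # per-unit setpoints

# 5 identical CHP units, each 10-50 MW, meeting a 180 MW demand:
plan = aggregate_dispatch(5, 10.0, 50.0, 180.0)
```

The aggregated problem has one output variable per time step instead of n, which is exactly the factor-of-n reduction described in the abstract; a real formulation would add minimum up/down times, start-up costs, and storage coupling as constraints.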
Procedia PDF Downloads 178
66 Numerical Simulation of Production of Microspheres from Polymer Emulsion in Microfluidic Device toward Using in Drug Delivery Systems
Authors: Nizar Jawad Hadi, Sajad Abd Alabbas
Abstract:
Because of their ability to encapsulate and release drugs in a controlled manner, microspheres fabricated from polymer emulsions using microfluidic devices have shown promise for drug delivery applications. In this study, the effects of velocity, density, viscosity, and surface tension, as well as channel diameter, on microsphere generation were investigated using Ansys Fluent software. The software was programmed with the physical properties of the polymer emulsion, such as density, viscosity, and surface tension. Simulations were then performed to predict fluid flow and microsphere production and to improve the design of drug delivery applications based on changes in these parameters. The effects of the capillary and Weber numbers were also studied. The results of the study showed that the size of the microspheres can be controlled by adjusting the velocity and the diameter of the channel. Smaller microspheres resulted from narrower channel widths and higher flow rates, which could improve drug delivery efficiency, while lower interfacial surface tension also produced smaller microspheres. The viscosity and density of the polymer emulsion significantly affected the size of the microspheres, with higher viscosities and densities producing smaller microspheres. The loading and drug release properties of the microspheres created with the microfluidic technique were also predicted. The results showed that the microspheres can efficiently encapsulate drugs and release them in a controlled manner over a period of time. This is due to the high surface-area-to-volume ratio of the microspheres, which allows for efficient drug diffusion. The ability to tune the manufacturing process using factors such as velocity, density, viscosity, channel diameter, and surface tension offers a potential opportunity to design drug delivery systems with greater efficiency and fewer side effects.
Keywords: polymer emulsion, microspheres, numerical simulation, microfluidic device
Procedia PDF Downloads 64
65 Corporate Digital Responsibility in Construction Engineering-Construction 4.0: Ethical Guidelines for Digitization and Artificial Intelligence
Authors: Weber-Lewerenz Bianca
Abstract:
Digitization is developing fast and has become a powerful tool for digital planning, construction, and operations. Its transformation bears high potential for companies, is critical for success, and thus requires responsible handling. This study assesses the calls made in the United Nations Sustainable Development Goals (SDGs) and in White Papers on AI by international institutions, the EU Commission, and the German Government for the consideration and protection of values and fundamental rights, a careful demarcation between machine (artificial) and human intelligence, and the careful use of such technologies. The study discusses digitization and the impacts of artificial intelligence (AI) in construction engineering from an ethical perspective, generating data through case studies and expert interviews as part of a qualitative method. This research critically evaluates opportunities and risks revolving around corporate digital responsibility (CDR) in the construction industry. To the author's knowledge, no study has set out to investigate how CDR in construction could be conceptualized, especially in relation to digitization and AI, to guide digital transformation in large, medium-sized, and small companies alike. No study has addressed the key research question: where can CDR be allocated, and how should an adequate ethical framework be designed to support digital innovations and make full use of the potentials of digitization and AI? Now is the right time for constructive approaches that apply ethics-by-design in order to develop and implement safe and efficient AI.
This represents the first study in construction engineering to apply a holistic, interdisciplinary, inclusive approach that provides guidelines for orientation, examines the benefits of AI, and defines ethical principles as the key drivers for success, resource-cost-time efficiency, and sustainability when using digital technologies and AI in construction engineering to enhance digital transformation. Innovative corporate organizations starting new business models are more likely to succeed than those dominated by conservative, traditional attitudes.
Keywords: construction engineering, digitization, digital transformation, artificial intelligence, ethics, corporate digital responsibility, digital innovation
Procedia PDF Downloads 248
64 Study on Capability of the Octocopter Configurations in Finite Element Analysis Simulation Environment
Authors: Jeet Shende, Leonid Shpanin, Misko Abramiuk, Mattew Goodwin, Nicholas Pickett
Abstract:
Energy harvesting on board the Unmanned Aerial Vehicle (UAV) is one of the most rapidly growing emerging technologies and consists of the collection of small amounts of energy, for different applications, from unconventional sources that are incidental to the operation of the parent system or device. Different energy harvesting techniques have already been investigated in multirotor drones, where the energy collected comes from the system's surrounding ambient environment and typically involves the conversion of solar, kinetic, or thermal energy into electrical energy. Energy harvesting from the vibrating propeller using piezoelectric components inside the propeller has also been proven to be feasible. However, the impact of this technology on UAV flight performance has not been investigated. In this contribution, the impact on multirotor drone operation has been investigated at different flight control configurations which support the efficient performance of propeller vibration energy harvesting. The industrially made MANTIS X8-PRO octocopter frame kit was used to explore the octocopter operation, which was modelled using the SolidWorks 3D CAD package for simulation studies. The octocopter flight control strategy is developed through integration of the SolidWorks 3D CAD software and the MATLAB/Simulink simulation environment for evaluation of the octocopter behaviour under different simulated flight modes and octocopter geometries. Analysis of the two modelled octocopter geometries and their flight performance is presented via graphical representation of simulated parameters. The possibility of not using the landing gear in the octocopter geometry is demonstrated. The conducted study evaluates the octocopter’s flight control technique and its impact on the energy harvesting mechanism developed on board the octocopter.
Finite Element Analysis (FEA) simulation results of the modelled octocopter in operation are presented, exploring the performance of the octocopter flight control and structural configurations. Applications of both octocopter structures and their flight control strategy are discussed.
Keywords: energy harvesting, flight control modelling, object modeling, unmanned aerial vehicle
Procedia PDF Downloads 76
63 Rapid Formation of Ortho-Boronoimines and Derivatives for Reversible and Dynamic Bioconjugation Under Physiological Conditions
Authors: Nicholas C. Rose, Christopher D. Spicer
Abstract:
The regeneration of damaged or diseased tissues would provide an invaluable therapeutic tool in biological research and medicine. Cells must be provided with a number of different biochemical signals in order to form mature tissue, through complex signaling networks that are difficult to recreate in synthetic materials. The ability to attach and detach bioactive proteins from a material in an iterative and dynamic manner would therefore present a powerful way to mimic natural biochemical signaling cascades for tissue growth. We propose to reversibly attach these bioactive proteins using ortho-boronoimine (oBI) linkages and related derivatives formed by the reaction of an ortho-boronobenzaldehyde with a nucleophilic amine derivative. To enable the use of oBIs for biomaterial modification, we have studied binding and cleavage processes in precise detail in the context of small-molecule models. A panel of oBI complexes has been synthesized and screened using a novel Förster resonance energy transfer (FRET) assay based on a cyanine dye FRET pair (Cy3 and Cy5) to identify the most reactive boron-aldehyde/amine nucleophile pairs. Upon conjugation of the dyes, FRET occurs under Cy3 excitation, and the resultant ratio of Cy3:Cy5 emission directly correlates with conversion. Reaction kinetics and equilibria can be accurately quantified for reactive pairs, with dissociation constants of oBI derivatives in water (KD) found to span 9 orders of magnitude (10⁻²-10⁻¹¹ M). These studies have provided us with a better understanding of oBI linkages that we hope to exploit to reversibly attach bioconjugates to materials. The long-term aim of the project is to develop a modular biomaterial platform that can be used to help combat chronic diseases such as osteoarthritis, heart disease, and chronic wounds by providing cells with potent biological stimuli for tissue engineering.
Keywords: dynamic, bioconjugation, boronoimine, rapid, physiological
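The two quantities at the heart of this assay can be sketched in a few lines: a dissociation constant computed from equilibrium concentrations, and an idealised ratiometric FRET readout. The `fret_conversion` helper is a hypothetical simplification of the Cy3:Cy5 ratio analysis, not the authors' calibration:

```python
def dissociation_constant(total_a, total_b, bound):
    """KD = [A][B] / [AB] at equilibrium, from total concentrations in M."""
    free_a = total_a - bound
    free_b = total_b - bound
    return free_a * free_b / bound

def fret_conversion(i_cy3, i_cy5):
    """Fraction conjugated, read off as the Cy5 share of total emission
    (an idealised proxy for the ratiometric readout described above)."""
    return i_cy5 / (i_cy3 + i_cy5)

# A tight oBI pair: mixing 1 uM + 1 uM and forming 0.9 uM complex at
# equilibrium implies KD ~ 1e-8 M (illustrative numbers).
kd = dissociation_constant(1e-6, 1e-6, 0.9e-6)
```

Real FRET quantification needs bleed-through and quantum-yield corrections, but the ratio-to-conversion idea is the same.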
Procedia PDF Downloads 96
62 Performance Management in Higher Education: Lessons from Germany's New Public Management System
Authors: Patrick Oehler, Nicholas Folger
Abstract:
Following a new public management approach, Germany widely reformed its higher education system around the turn of the millennium. Aimed at preparing the country’s publicly funded universities and applied science colleges for a century of glory, the reforms led to the introduction of rigid performance measurement and management practices, which disrupted the inert system on all levels. Yet, many of the new policies met significant resistance, and some of them had to be reversed over time. Ever since, Germany has struggled to find a balance between its pre- and post-millennial approaches to performance measurement and management. This contribution combines insights from a joint research project, which was created and funded by the German Federal Ministry of Education and Research with the aim of better understanding the effects of its performance measurement and management policies, including those the ministry had implemented over the previous decades. The project brings together researchers from 17 German research institutions who employed a wide range of theories from various disciplines and very diverse research methods to explain performance measurement and management and their consequences for the behavior of various stakeholders in higher education systems. In these projects, performance measurement and management have been researched from three angles: education, research, and third mission. The collaborative project differentiated functional and dysfunctional elements of common performance measurement and management practices and identified key problems with these practices, such as (1) oversimplification of performance indicators, (2) ‘overmeasurement’ of performance in general, (3) excessive use of quantitative indicators, and (4) a myopic focus on research-oriented indicators and a neglect of measures targeting education and third mission.
To address these issues, the collaborative project developed alternative approaches to performance measurement and management, including suggestions for qualitative performance measures; improved supervision, review, and evaluation methods; and recommendations on how to better balance education, research, and third mission. The authors would like to share the rich findings of the joint research project with an international audience and discuss their implications for alternative higher education systems.
Keywords: performance measurement, performance management, new public management, performance evaluation
Procedia PDF Downloads 270
61 De Novo Design of Functional Metalloproteins for Biocatalytic Reactions
Authors: Ketaki D. Belsare, Nicholas F. Polizzi, Lior Shtayer, William F. DeGrado
Abstract:
Nature utilizes metalloproteins to perform chemical transformations with activities and selectivities that have long been the inspiration for design principles in synthetic and biological systems. The chemical reactivities of metalloproteins are directly linked to local environment effects produced by the protein matrix around the metal cofactor. A complete understanding of how the protein matrix provides these interactions would allow for the design of functional metalloproteins. The de novo computational design of proteins has been successfully used in the design of active sites that bind metal cofactors containing di-iron, zinc, or copper; however, precisely designing active sites that can bind small-molecule ligands (e.g., substrates) along with metal cofactors is still a challenge in the field. The de novo computational design of a functional metalloprotein that contains a purposefully designed substrate binding site would allow for precise control of chemical function and reactivity. Our research strategy seeks to elucidate the design features necessary to bind the cofactor protoporphyrin IX (hemin) in close proximity to a substrate binding pocket in a four-helix bundle. First- and second-shell interactions are computationally designed to control the orientation, electronic structure, and reaction pathway of the cofactor and substrate. The design began with a parameterized helical backbone that positioned a single histidine residue (as an axial ligand) to receive a second-shell H-bond from a threonine on the neighboring helix. The metallo-cofactor, hemin, was then manually placed in the binding site. A structural feature, a pi-bulge, was introduced to give the substrate access to the protoporphyrin IX. These de novo metalloproteins are currently being tested for their activity towards hydroxylation and epoxidation. The de novo designed protein shows hydroxylation of aniline to 4-aminophenol.
This study will help provide structural information of utmost importance for understanding the de novo computational design variables impacting the functional activities of a protein.
Keywords: metalloproteins, protein design, de novo protein, biocatalysis
Procedia PDF Downloads 151
60 Numerical Study of the Breakdown of Surface Divergence Based Models for Interfacial Gas Transfer Velocity at Large Contamination Levels
Authors: Yasemin Akar, Jan G. Wissink, Herlina Herlina
Abstract:
The effect of various levels of contamination on the interfacial air-water gas transfer velocity is studied by Direct Numerical Simulation (DNS). The interfacial gas transfer is driven by isotropic turbulence, introduced at the bottom of the computational domain, diffusing upwards. The isotropic turbulence is generated in a separate, concurrently running large-eddy simulation (LES). The flow fields in the main DNS and the LES are solved using fourth-order discretisations of convection and diffusion. To solve the transport of dissolved gases in water, a fifth-order-accurate WENO scheme is used for scalar convection, combined with a fourth-order central discretisation for scalar diffusion. The damping effect of the surfactant contamination on the near-surface (horizontal) velocities in the DNS is modelled using horizontal gradients of the surfactant concentration. An important parameter in this model, which corresponds to the level of contamination, is ReMa/We, where Re is the Reynolds number, Ma is the Marangoni number, and We is the Weber number. It was previously found that even small levels of contamination (ReMa/We small) lead to a significant drop in the interfacial gas transfer velocity KL. It is known that KL depends on both the Schmidt number Sc (the ratio of the kinematic viscosity and the gas diffusivity in water) and the surface divergence β, i.e. K_L ∝ √(β/Sc). Previously it has been shown that this relation works well for surfaces with low to moderate contamination; however, it will break down for β close to zero. To study the validity of this dependence in the presence of surface contamination, simulations were carried out for ReMa/We = 0, 0.12, 0.6, 1.2, 6, 30 and Sc = 2, 4, 8, 16, 32. First, it will be shown that the scaling of KL with Sc remains valid also for larger ReMa/We.
This is an important result that indicates that, for various levels of contamination, the numerical results obtained at low Schmidt numbers are also valid for significantly higher and more realistic Sc. Subsequently, it will be shown that, with increasing levels of ReMa/We, the dependency of KL on β begins to break down, as the increased damping of near-surface fluctuations results in an increased damping of β. Especially for large levels of contamination, this damping is so severe that KL is found to be underestimated significantly.
Keywords: contamination, gas transfer, surfactants, turbulence
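The surface-divergence scaling K_L ∝ √(β/Sc) discussed above is easy to state in code; the proportionality constant c is left as an assumed parameter:

```python
import math

def gas_transfer_velocity(beta, sc, c=1.0):
    """Surface-divergence model: K_L = c * sqrt(beta / Sc)."""
    return c * math.sqrt(beta / sc)

# At a fixed surface divergence, a fourfold increase in Sc halves K_L:
k_sc2 = gas_transfer_velocity(0.5, 2)
k_sc8 = gas_transfer_velocity(0.5, 8)
```

The breakdown reported in the abstract is precisely the regime where contamination drives β toward zero, so this square-root form underestimates the actual K_L.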
Procedia PDF Downloads 300
59 Pathologies in the Left Atrium Reproduced Using a Low-Order Synergistic Numerical Model of the Cardiovascular System
Authors: Nicholas Pearce, Eun-jin Kim
Abstract:
Pathologies of the cardiovascular (CV) system remain a serious and deadly health problem for human society. Computational modelling provides a relatively accessible tool for diagnosis, treatment, and research into CV disorders. However, numerical models of the CV system have largely focused on the function of the ventricles, frequently overlooking the behaviour of the atria. Furthermore, in the study of the pressure-volume relationship of the heart, which is a key diagnostic of cardiovascular pathologies, previous works often invoke the popular yet questionable time-varying elastance (TVE) method, which imposes the pressure-volume relationship instead of calculating it consistently. Despite the convenience of the TVE method, there have been various indications of its limitations and of the need to check its validity in different scenarios. A model of the combined left ventricle (LV) and left atrium (LA) is presented which consistently considers various feedback mechanisms in the heart without having to use the TVE method. Specifically, a synergistic model of the left ventricle is extended and modified to include the function of the LA. The synergy of the original model is preserved by modelling the electro-mechanical and chemical functions of the micro-scale myofiber for the LA and integrating it with the micro-scale and macro-organ-scale heart dynamics of the left ventricle and CV circulation. The atrioventricular node function is included and forms the conduction pathway for electrical signals between the atria and ventricle. The model reproduces the essential features of LA behaviour, such as the two-phase pressure-volume relationship and the classic figure-of-eight pressure-volume loops. Using this model, disorders in the internal cardiac electrical signalling are investigated by recreating the mechano-electric feedback (MEF), which is impossible where the time-varying elastance method is used.
The effects of AV node block and slow conduction are then investigated in the presence of an atrial arrhythmia. It is found that electrical disorders and arrhythmia in the LA degrade the CV system by reducing the cardiac output, power, and heart rate.
Keywords: cardiovascular system, left atrium, numerical model, MEF
Procedia PDF Downloads 115
58 Stress Hyperglycaemia and Glycaemic Control Post Cardiac Surgery: Relaxed Targets May Be Acceptable
Authors: Nicholas Bayfield, Liam Bibo, Charley Budgeon, Robert Larbalestier, Tom Briffa
Abstract:
Introduction: Stress hyperglycaemia is common following cardiac surgery. Its optimal management is uncertain and may differ by diabetic status. This study assesses the in-hospital glycaemic management of cardiac surgery patients and associated postoperative outcomes. Methods: A retrospective cohort analysis of all patients undergoing cardiac surgery at Fiona Stanley Hospital from February 2015 to May 2019 was undertaken. Management and outcomes of hyperglycaemia following cardiac surgery were assessed. Follow-up was assessed to 1 year postoperatively. Multivariate regression modelling was utilised. Results: 1050 non-diabetic patients and 689 diabetic patients were included. In the non-diabetic cohort, patients with mild (peak blood sugar level [BSL] < 14.3), transient stress hyperglycaemia managed without insulin were not at an increased risk of wound-related morbidity (P=0.899) or mortality at 1 year (P=0.483). Insulin management was associated with wound-related readmission to hospital (P=0.004) and superficial sternal wound infection (P=0.047). Prolonged or severe stress hyperglycaemia was predictive of hospital re-admission (P=0.050) but not morbidity or mortality (P=0.546). Diabetes mellitus was an independent risk factor for 1-year mortality (OR; 1.972 [1.041-3.736], P=0.037), graft harvest site wound infection (OR; 1.810 [1.134-2.889], P=0.013), and wound-related readmission (OR; 1.866 [1.076-3.236], P=0.026). In diabetics, a postoperative peak BSL > 13.9 mmol/L was predictive of graft harvest site infections (OR; 3.528 [1.724-7.217], P=0.001) and wound-related readmission (OR; 3.462 [1.540-7.783], P=0.003) regardless of the modality of management. A peak BSL of 10.0-13.9 did not increase the risk of morbidity/mortality compared to a peak BSL of < 10.0 (P=0.557). Diabetics with a peak BSL of 13.9 or less did not have significantly increased morbidity/mortality outcomes compared to non-diabetics (P=0.418).
Conclusion: In non-diabetic patients, transient mild stress hyperglycaemia following cardiac surgery does not uniformly require treatment. In diabetic patients, postoperative hyperglycaemia with a peak BSL exceeding 13.9 mmol/L was associated with wound-related morbidity and hospital readmission following cardiac surgery.
Keywords: cardiac surgery, pulmonary embolism, pulmonary embolectomy, cardiopulmonary bypass
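As a side note on the statistics reported above: Wald confidence intervals for odds ratios are symmetric on the log scale, so a reported point estimate should equal the geometric mean of its interval limits — a quick consistency check one can run on figures like OR 1.972 [1.041-3.736]. A generic sketch (the `or_ci` helper is not tied to this study's model):

```python
import math

def or_ci(log_or, se, z=1.96):
    """Odds ratio and 95% Wald CI from a logistic-regression
    coefficient (log-odds) and its standard error."""
    return (math.exp(log_or),
            math.exp(log_or - z * se),
            math.exp(log_or + z * se))

# Log-scale symmetry means the point estimate is the geometric mean of
# the CI limits, e.g. for OR 1.972 [1.041-3.736]:
point = math.sqrt(1.041 * 3.736)   # ~1.972, matching the reported OR
```

The same check applies to the hazard ratios and other ORs quoted in these abstracts.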
Procedia PDF Downloads 162
57 The Effects of Physiological Stress on Global and Regional Repolarisation in the Human Heart in Vivo
Authors: May Khei Hu, Kevin Leong, Fu Siong Ng, Nicholas Peter
Abstract:
Introduction: Sympathetic stimulation has been recognised as a potent stimulus of arrhythmogenesis in various cardiac pathologies, possibly by augmenting the dispersion of repolarisation. The effects of sympathetic stimulation in healthy subjects, however, remain unclear. It is therefore crucial to first establish the effects of physiological stress on the dispersion of repolarisation in healthy subjects before studying these effects in pathological cardiac conditions. We hypothesised that the activation-recovery interval (ARI; a surrogate of action potential duration) and the dispersion of repolarisation decrease on sympathetic stimulation. Methods: Eight patients aged 18-55 years with structurally normal hearts underwent a head-up tilt test (HUTT) and an exercise tolerance test (ETT) while wearing the electrocardiographic imaging (ECGi) vest. Patients later underwent a CT scan, and the epicardial potentials were reconstructed using the ECGi software. Activation and recovery times were determined from the acquired electrograms. ARI was calculated and later corrected using Bazett’s formula. Global and regional dispersion of repolarisation were determined from the standard deviation of the corrected ARI (ARIc). One-way analysis of variance (ANOVA) and the Wilcoxon test were used to evaluate statistical significance. Results: Global ARIc increased significantly [p<0.01] when patients were tilted upwards but decreased significantly after five minutes [p<0.01]. A subsequent post-hoc analysis revealed that the decrease in R-R was more substantial than the change in ARI, resulting in the observed increase in ARIc. Global ARIc decreased on peak exercise [p<0.01] but increased on recovery [p<0.01]. Global dispersion increased significantly on peak exercise [p<0.05], although there were no significant changes in regional dispersion. There were no significant changes in either global or regional dispersion during tilt.
Conclusion: ARIc decreases upon sympathetic stimulation in healthy subjects. Global dispersion of repolarisation increases upon exercise, although there were no changes in global or regional dispersion during orthostatic stress.
Keywords: dispersion of repolarisation, sympathetic stimulation, head-up tilt test (HUTT), exercise tolerance test (ETT), electrocardiographic imaging (ECGi)
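Bazett's correction, used above to normalise ARI for heart rate, divides by the square root of the R-R interval (in seconds). A small sketch with illustrative numbers shows why a shortened R-R alone can raise ARIc even when the raw ARI is unchanged — the mechanism behind the tilt observation:

```python
import math

def bazett_corrected_ari(ari_ms, rr_ms):
    """Bazett's correction: ARIc = ARI / sqrt(RR), with RR in seconds."""
    return ari_ms / math.sqrt(rr_ms / 1000.0)

# Illustrative numbers (not from the study): if tilt shortens R-R from
# 1000 ms to 640 ms while ARI stays at 250 ms, ARIc still rises because
# the R-R term dominates.
rest = bazett_corrected_ari(250.0, 1000.0)   # RR = 1 s, so ARIc = ARI
tilt = bazett_corrected_ari(250.0, 640.0)    # shorter RR, larger ARIc
```

This matches the post-hoc finding that the R-R decrease, rather than the ARI change, drove the increase in ARIc during tilt.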
Procedia PDF Downloads 197
56 Strategies for Management of Massive Intraoperative Airway Haemorrhage Complicating Surgical Pulmonary Embolectomy
Authors: Nicholas Bayfield, Liam Bibo, Kaushelandra Rathore, Lucas Sanders, Mark Newman
Abstract:
INTRODUCTION: Surgical pulmonary embolectomy is an established therapy for acute pulmonary embolism causing right heart dysfunction and haemodynamic instability. Massive intraoperative airway haemorrhage is a rare complication of pulmonary embolectomy. We present our institutional experience with massive airway haemorrhage complicating pulmonary embolectomy and discuss optimal therapeutic strategies. METHODS: A retrospective review of emergent surgical pulmonary embolectomy patients was undertaken. Cases complicated by massive intra-operative airway haemorrhage were identified. Intra- and peri-operative management strategies were analysed and discussed. RESULTS: Of 76 patients undergoing emergent or salvage pulmonary embolectomy, three cases (3.9%) of massive intraoperative airway haemorrhage were identified. Haemorrhage always began on weaning from cardiopulmonary bypass. Successful management strategies involved intraoperative isolation of the side of bleeding, occluding the affected airway with an endobronchial blocker, institution of veno-arterial (VA) extracorporeal membrane oxygenation (ECMO) and reversal of anticoagulation. Running the ECMO without heparinisation allows coagulation to occur. Airway haemorrhage was controlled within 24 hours of operation in all patients, allowing re-institution of dual lung ventilation and decannulation from ECMO. One case in which positive end-expiratory airway pressure was trialled initially was complicated by air embolism. Although airway haemorrhage was controlled successfully in all cases, all patients died in-hospital for reasons unrelated to the airway haemorrhage. CONCLUSION: Massive intraoperative airway haemorrhage during pulmonary embolectomy is a rare complication with potentially catastrophic outcomes. Re-perfusion alveolar and capillary injury is the likely aetiology. With a systematic approach to management, airway haemorrhage can be well controlled intra-operatively and often resolves within 24 hours. 
Stopping blood flow to the pulmonary arteries and support of oxygenation by the institution of VA ECMO is important. This management has been successful in our 3 cases.
Keywords: pulmonary embolectomy, cardiopulmonary bypass, cardiac surgery, pulmonary embolism
55 Mitigating Self-Regulation Issues in the Online Instruction of Math
Authors: Robert Vanderburg, Michael Cowling, Nicholas Gibson
Abstract:
Mathematics is one of the core subjects taught in the Australian K-12 education system and is considered an important component for future studies in areas such as engineering and technology. In addition, Australia has been a world leader in distance education due to the vastness of its geographic landscape. Despite this, research is still needed on distance math instruction. Even though delivery of curriculum has given way to online studies, and there is a resultant push for computer-based (PC, tablet, smartphone) math instruction, much instruction still involves practice problems similar to those in the original curriculum packs, without the ability for students to self-regulate their learning using the full interactive capabilities of these devices. Given this need, this paper addresses issues students have during online instruction. The study involved 32 students struggling with mathematics who were enrolled in a math tutorial conducted in an online setting. The study used a case study design to understand some of the blockades hindering the students’ success. Data were collected by tracking students’ practice and quizzes, tracking engagement with the site, recording one-on-one tutorials, and interviewing the students. Results revealed that when students face cognitively straining tasks in an online instructional setting, the first thing to dissipate is their ability to self-regulate. The results also revealed that instructors could ameliorate the situation, and they provided useful data on strategies for designing future online tasks. Specifically, instructors could utilize cognitive dissonance strategies to reduce the cognitive drain of the tasks online. They could segment the instruction process to reduce the cognitive demands of the tasks and provide in-depth self-regulatory training, freeing mental capacity for the mathematics content. 
Finally, instructors could provide specific scheduling and assignment structure changes to reduce the amount of student-centered self-regulatory tasks in the class. These findings will be discussed in more detail and summarized in a framework that can be used for future work.
Keywords: digital education, distance education, mathematics education, self-regulation
54 Periplasmic Expression of Anti-RoxP Antibody Fragments in Escherichia coli
Authors: Caspar S. Carson, Gabriel W. Prather, Nicholas E. Wong, Jeffery R. Anton, William H. McCoy
Abstract:
Cutibacterium acnes is a commensal bacterium found on human skin that has been linked to acne. C. acnes can also be an opportunistic pathogen when it infiltrates the body during surgery. This pathogen can cause dangerous infections of medical implants, such as shoulder replacements, leading to life-threatening blood infections. Compounding this issue, C. acnes resistance to many antibiotics has become an increasing problem worldwide, creating a need for new forms of treatment. C. acnes expresses the protein RoxP and requires it to colonize human skin, yet the function of this protein is not yet understood. Inhibition of RoxP function might therefore be an effective treatment for C. acnes infections. To develop reagents capable of such inhibition, the McCoy Laboratory generated four unique anti-RoxP antibodies. Preliminary studies in the McCoy Laboratory have established that each antibody binds a distinct site on RoxP. To assess the potential of these antibodies as therapeutics, it is necessary to specifically characterize their epitopes and to evaluate their ability to inhibit RoxP-dependent C. acnes growth in appropriate assays. To provide material for these studies, an antibody expression construct, Fv-clasp(v2), was adapted to encode the anti-RoxP antibody sequences. The author hypothesizes that this expression strategy can produce sufficient amounts of >95% pure antibody fragments for further characterization of these antibodies. Four anti-RoxP Fv-clasp(v2) expression constructs (pET vector-based) were transformed into E. coli BL21-Gold(DE3) cells, and a small-scale expression and purification trial was performed for each construct to evaluate anti-RoxP Fv-clasp(v2) yield and purity. Successful expression and purification of these antibody constructs will allow for their use in structural studies, such as protein crystallography and cryogenic electron microscopy. 
Such studies would help to define the antibody binding sites on RoxP, which could then be leveraged in the development of methods to treat C. acnes infection through RoxP inhibition.
Keywords: structural biology, protein expression, infectious disease, antibody, therapeutics, E. coli
53 Identification of Candidate Congenital Heart Defects Biomarkers by Applying a Random Forest Approach on DNA Methylation Data
Authors: Kan Yu, Khui Hung Lee, Eben Afrifa-Yamoah, Jing Guo, Katrina Harrison, Jack Goldblatt, Nicholas Pachter, Jitian Xiao, Guicheng Brad Zhang
Abstract:
Background and Significance of the Study: Congenital Heart Defects (CHDs) are the most common malformation at birth and one of the leading causes of infant death. Although the exact etiology remains a significant challenge, epigenetic modifications, such as DNA methylation, are thought to contribute to the pathogenesis of congenital heart defects. At present, no DNA methylation biomarkers are in use for early detection of CHDs. Existing CHD diagnostic techniques are time-consuming and costly and can only be used to diagnose CHDs after an infant is born. The present study employed a machine learning technique to analyse genome-wide methylation data in children with and without CHDs, with the aim of finding methylation biomarkers for CHDs. Methods: The Illumina Human Methylation EPIC BeadChip was used to screen the genome-wide DNA methylation profiles of 24 infants diagnosed with congenital heart defects and 24 healthy infants without congenital heart defects. Primary pre-processing was conducted using the RnBeads and limma packages. The methylation levels of the top 600 genes with the lowest p-values were selected and further investigated using a random forest approach. ROC curves were used to analyse the sensitivity and specificity of each biomarker in both training and test sample sets. The functions of selected genes with high sensitivity and specificity were then assessed in molecular processes. Major Findings of the Study: Three genes (MIR663, FGF3, and FAM64A) were identified from both training and validation data by random forests, with an average sensitivity and specificity of 85% and 95%, respectively. GO analyses for the top 600 genes showed that these putative differentially methylated genes were primarily associated with regulation of lipid metabolic process, protein-containing complex localization, and the Notch signalling pathway. 
The present findings highlight that aberrant DNA methylation may play a significant role in the pathogenesis of congenital heart defects.
Keywords: biomarker, congenital heart defects, DNA methylation, random forest
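The two-stage pipeline described above (univariate filter to the top 600 features, then a random forest evaluated by ROC) can be sketched as follows. The data here are synthetic stand-ins, the univariate screen is approximated by a mean-difference ranking rather than limma's moderated p-values, and the forest size is an assumption; the sketch shows the shape of the analysis, not the study's actual code.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n_samples, n_probes = 48, 2000          # 24 CHD + 24 controls in the study
X = rng.random((n_samples, n_probes))   # stand-in for methylation beta values
y = np.repeat([0, 1], n_samples // 2)   # 0 = control, 1 = CHD

# Stage 1: univariate screen -- keep the most group-discriminating probes
# (the study kept the 600 genes with the lowest p-values).
diff = np.abs(X[y == 1].mean(axis=0) - X[y == 0].mean(axis=0))
top = np.argsort(diff)[-600:]

# Stage 2: random forest on the selected features, ROC on a held-out split.
X_tr, X_te, y_tr, y_te = train_test_split(
    X[:, top], y, random_state=0, stratify=y)
clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
auc = roc_auc_score(y_te, clf.predict_proba(X_te)[:, 1])
```

Note the design caveat this illustrates: because the feature screen here is run on all samples before splitting, the held-out AUC is optimistically biased; screening inside the training fold only avoids that leakage.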
52 Autophagy Defects That Modify Human Immune Cell Metabolism and Promote Aging-Associated Inflammation
Authors: Grace McCambridge, Alanna Keady, Madhur Agrawal, Dequina Nicholas Alvarado, Barbara Nikolajczyk, Leena Panneerseelan-Bharath
Abstract:
Age is a non-modifiable risk factor for the inflammation that underlies pathologies such as type 2 diabetes mellitus (T2DM). Inflammation, as indicated by circulating cytokines, rises in aging, but the mechanisms that promote this ‘inflammaging’ remain poorly defined. Furthermore, downstream consequences of inflammaging, including the development of an inflammatory profile that predicts comorbidities like T2DM, remain speculative. We tested the possibility that natural aging-associated changes in autophagy, a process that is compromised in both aging and T2DM, regulate inflammatory profiles in older subjects. Our data showed that circulating CD4⁺ T cells from older compared to younger subjects have (i) defects in autophagy; (ii) higher mitochondria accumulation; (iii) a failure to metabolically shift from oxidative phosphorylation to anaerobic glycolysis upon αCD3/CD28 activation; (iv) more reactive oxygen species (ROS) accumulation; and (v) a cytokine profile that recapitulates the Th17 profile that predicts T2DM. ROS scavenging in cells from older subjects restored mitochondrial mass and membrane potential (indicators of improved autophagy) and reduced Th17 cytokines to the amounts made by T cells from younger subjects. Knock-down of the autophagy protein Atg3 in T cells from younger subjects increased mitochondrial accumulation and Th17 cytokines. To begin translating these findings to clinical practice, we showed that physiological concentrations of the diabetes drug metformin (100 µM) added in vitro enhanced autophagy, prevented mitochondria and ROS accumulation, increased anaerobic glycolysis, and decreased Th17 cytokines in activated CD4⁺ T cells from older subjects. Metformin therefore improves autophagy and ameliorates multiple downstream pro-inflammatory mechanisms in CD4⁺ T cells from older subjects. 
We conclude that improving autophagy mitigates the development of a T2DM-predictive Th17 profile in aging, and thus holds promise for delay or prevention of aging-associated metabolic decline.
Keywords: autophagy, mitochondrial turnover, ROS, glycolysis
51 Adsorption and Desorption Behavior of Ionic and Nonionic Surfactants on Polymer Surfaces
Authors: Giulia Magi Meconi, Nicholas Ballard, José M. Asua, Ronen Zangi
Abstract:
Experimental and computational studies are combined to elucidate the adsorption properties of ionic and nonionic surfactants on a hydrophobic polymer surface such as poly(styrene). To represent these two types of surfactants, sodium dodecyl sulfate and poly(ethylene glycol)-block-poly(ethylene), both commonly utilized in emulsion polymerization, were chosen. By applying quartz crystal microbalance with dissipation monitoring, it is found that, at low surfactant concentrations, it is easier to desorb (as measured by rate) ionic surfactants than nonionic surfactants. From molecular dynamics simulations, the effective attractive force of the nonionic surfactants to the surface increases as their concentration decreases, whereas the ionic surfactant mildly exhibits the opposite trend. The contrasting behavior of ionic and nonionic surfactants critically relies on two observations obtained from the simulations. The first is that there is a large degree of interweavement between head and tail groups in the adsorbed layer formed by the nonionic surfactant (PEO/PE systems). The second is that water molecules penetrate this layer. In the disordered layer these nonionic surfactants generate at the surface, only oxygens of the head groups present at the interface with the water phase, or oxygens next to the penetrating waters, can form hydrogen bonds. Oxygens inside this layer lose this favorable energy, with a magnitude that increases with the surfactant density at the interface. This reduced stability of the surfactants diminishes their driving force for adsorption. All of this is shown to be in accordance with experimental results on the dynamics of surfactant desorption. Ionic surfactants assemble into an ordered structure, and their attraction to the surface was even slightly augmented at higher surfactant concentration, in agreement with the experimentally determined adsorption isotherm. 
The reason these two types of surfactants behave differently is that the ionic surfactant has a small head group that is strongly hydrophilic, whereas the head groups of the nonionic surfactants are large and only weakly attracted to water.
Keywords: emulsion polymerization process, molecular dynamics simulations, polymer surface, surfactants adsorption
50 Development of Positron Emission Tomography (PET) Tracers for the In Vivo Imaging of α-Synuclein Aggregates in α-Synucleinopathies
Authors: Bright Chukwunwike Uzuegbunam, Wojciech Paslawski, Hans Agren, Christer Halldin, Wolfgang Weber, Markus Luster, Thomas Arzberger, Behrooz Hooshyar Yousefi
Abstract:
There is a need to develop a PET tracer that will enable the diagnosis and tracking of the progression of alpha-synucleinopathies (Parkinson’s disease [PD], dementia with Lewy bodies [DLB], multiple system atrophy [MSA]) in living subjects over time. Alpha-synuclein aggregates (a-syn), which are present at all stages of disease progression, for instance in PD, are a suitable target for in vivo PET imaging. For this reason, we have developed some promising a-syn tracers based on a diarylbisthiazole (DABTA) scaffold. The precursors are synthesized via a modified Hantzsch thiazole synthesis and then radiolabeled via one- or two-step radiofluorination methods. The ligands were initially screened using a combination of molecular dynamics and quantum/molecular mechanics approaches to calculate their binding affinity to a-syn (in silico binding experiments). Experimental in vitro binding assays were also performed. The ligands were further screened in other experiments, including log D, in vitro plasma protein binding and plasma stability, and biodistribution and brain metabolite analyses in healthy mice. Radiochemical yields ranged from 30% to 72%. Molecular docking revealed possible binding sites in a-syn and the free energy of binding to those sites (-28.9 to -66.9 kcal/mol), which correlated with the high binding affinity of the DABTAs to a-syn (Ki as low as 0.5 nM) and their selectivity (> 100-fold) over Aβ and tau, which usually co-exist with a-syn in some pathologies. The log D values ranged from 2.34 to 2.88, which correlated with a free-protein fraction of 0.28% to 0.5%. Biodistribution experiments revealed that the tracers are taken up in the brain (5.6 %ID/g to 7.3 %ID/g) at 5 min post-injection (p.i.) and cleared out (values as low as 0.39 %ID/g at 120 min p.i.). Analyses of mouse brain 20 min p.i. revealed almost no radiometabolites in the brain in most cases. 
It can be concluded that the in silico study presents a new avenue for the rational development of radioligands with suitable features. The results obtained so far are promising and encourage us to further validate the DABTAs in autoradiography, immunohistochemistry, and in vivo imaging in non-human primates and humans.
Keywords: alpha-synuclein aggregates, alpha-synucleinopathies, PET imaging, tracer development
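The measured affinities quoted above can be related to a binding free energy through the standard thermodynamic conversion ΔG = RT ln(Ki). This is a generic textbook calculation, not part of the study's workflow, and the assumed temperature is 298 K; it also illustrates why docking scores in the tens of kcal/mol are interaction energies rather than true free energies of binding.

```python
import math

R_KCAL = 1.987e-3   # gas constant, kcal/(mol*K)
T = 298.0           # assumed temperature, K

def ki_to_delta_g(ki_molar):
    """Convert an equilibrium inhibition constant (mol/L) to a binding
    free energy via dG = R * T * ln(Ki); tighter binding gives a more
    negative dG."""
    return R_KCAL * T * math.log(ki_molar)

dg = ki_to_delta_g(0.5e-9)   # Ki = 0.5 nM -> roughly -12.7 kcal/mol
```

A Ki of 0.5 nM corresponds to about -12.7 kcal/mol, well short of the -28.9 to -66.9 kcal/mol docking energies, which is expected: force-field docking scores neglect entropy and desolvation and so only rank-correlate with measured affinity.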
49 On the Survival of Individuals with Type 2 Diabetes Mellitus in the United Kingdom: A Retrospective Case-Control Study
Authors: Njabulo Ncube, Elena Kulinskaya, Nicholas Steel, Dmitry Pshezhetskiy
Abstract:
Life expectancy in the United Kingdom (UK) has been near constant since 2010, particularly for individuals aged 65 years and older. This trend has also been noted in several other countries. This slowdown in the increase of life expectancy was concurrent with an increase in the number of deaths caused by non-communicable diseases. Of particular concern is the worldwide exponential increase in the number of diabetes-related deaths. Previous studies have reported increased mortality hazards among diabetics compared to non-diabetics, and differing effects of antidiabetic drugs on mortality hazards. This study aimed to estimate the all-cause mortality hazards and related life expectancies among type 2 diabetes (T2DM) patients in the UK using a time-variant Gompertz-Cox model with frailty. The study also aimed to understand the major causes of the change in life expectancy growth in the last decade. A total of 221,182 individuals (30.8% T2DM, 57.6% male) aged 50 years and above, born between 1930 and 1960 inclusive, and diagnosed between 2000 and 2016, were selected from The Health Improvement Network (THIN) database of UK primary care data and followed up to 31 December 2016. About 13.4% of participants died during the follow-up period. The overall all-cause mortality hazard ratio of T2DM compared to non-diabetic controls was 1.467 (1.381-1.558) when diagnosed between 50 and 59 years and 1.38 (1.307-1.457) when diagnosed between 60 and 74 years. The estimated life expectancies of T2DM individuals without further comorbidities diagnosed at the age of 60 years were 2.43 (1930-1939 birth cohort), 2.53 (1940-1949 birth cohort), and 3.28 (1950-1960 birth cohort) years less than those of non-diabetic controls. However, the 1950-1960 birth cohort had a steeper hazard function than the 1940-1949 birth cohort for both T2DM and non-diabetic individuals. In conclusion, mortality hazards for people with T2DM continue to be higher than for non-diabetics. 
The steeper mortality hazard slope for the 1950-1960 birth cohort might identify a sub-population contributing to the slowdown in the growth of life expectancy.
Keywords: T2DM, Gompertz-Cox model with frailty, all-cause mortality, life expectancy
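The hazard ratios quoted above come from a proportional-hazards structure on a Gompertz baseline. The following is a minimal sketch of that structure, assuming illustrative parameter values; it omits the frailty term and the time-variant coefficients of the actual model.

```python
import math

def gompertz_hazard(t, a, b):
    """Gompertz baseline hazard h0(t) = a * exp(b * t): the classic
    model of adult mortality rising exponentially with age t."""
    return a * math.exp(b * t)

def ph_hazard(t, a, b, beta, x):
    """Proportional-hazards extension: covariates (e.g. T2DM status,
    birth cohort) scale the baseline hazard by exp(beta . x)."""
    lp = sum(bi * xi for bi, xi in zip(beta, x))
    return gompertz_hazard(t, a, b) * math.exp(lp)

# With a coefficient of ln(1.467) on T2DM status, the hazard ratio vs.
# controls is 1.467 at every age, mirroring the reported estimate for
# diagnosis at 50-59 years (a and b below are placeholder values).
a, b = 1e-4, 0.09
hr = ph_hazard(60, a, b, [math.log(1.467)], [1]) / gompertz_hazard(60, a, b)
```

A steeper hazard function for a cohort corresponds to a larger Gompertz slope b, which is the quantity the conclusion points to for the 1950-1960 cohort.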
48 How Childhood Trauma Changes the Recovery Models
Authors: John Michael Weber
Abstract:
The following research spanned six months and 175 people addicted to some form of substance, from alcohol to heroin. One question was asked, and the answers were striking and consistent. The following work details the results of this writer’s answer to his own question and the 175 answers that followed. A constant pattern took shape throughout the bio-psycho-social assessments: these addicts had “first memories,” the memories were vivid and took place between the ages of three and six years old, and to a person those first memories were traumatic. This writer’s personal search into his childhood was not to find an excuse for the way he became, but to explain the reason for becoming an addict. To treat addiction, these memories, which have caused Post Traumatic Stress Disorder (PTSD), must be recognized as the catalyst that sparked a predisposition. Cognitive Behavioral Therapy (CBT), integrated with treatment specifically focused on PTSD, gives the addict a better chance at recovery without relapse. This paper seeks to present the first-memory findings of the addicts assessed and to propose the best treatment plan for such an addict, considering the childhood trauma in congruence with treatment of the Substance Use Disorder (SUD). The question posed concerned what their first life memory was. It is the hope of this author that the knowledge that trauma is one of the main catalysts for addiction will allow therapists to provide better treatment and reduce relapse from abstinence from drugs and alcohol. This research led this author to believe that if treatment of childhood trauma is not a priority, the twelve steps of Alcoholics Anonymous, specifically steps 4 and 5, will not be thoroughly addressed, and the odds of relapse increase. With this knowledge, parents can be educated on childhood trauma and the effect it has on their children. 
Parents could be mindful of the fact that the things they perceive as traumatic do not match what a child, in the developmental years, absorbs as traumatic. It is this author’s belief that what has become the status quo in treatment facilities has not been working for a long time. It is for that reason this author believes things need to change. Relapse has been woven into the fabric of standard operating procedure, and that, in this author’s view, is not necessary. Childhood trauma is not being addressed early in recovery, and that creates an environment of inevitable relapse. This paper will explore how to break away from the status quo and rethink the current “evidence-based treatments.” With that, this abstract ends, in the hope that interest has been piqued to read on.
Keywords: childhood, trauma, treatment, addiction, change
47 Maresin Like 1 Treatment: Curbing the Pathogenesis of Behavioral Dysfunction and Neurodegeneration in Alzheimer's Disease Mouse Model
Authors: Yan Lu, Song Hong, Janakiraman Udaiyappan, Aarti Nagayach, Quoc-Viet A. Duong, Masao Morita, Shun Saito, Yuichi Kobayashi, Yuhai Zhao, Hongying Peng, Nicholas B. Pham, Walter J. Lukiw, Christopher A. Vuong, Nicolas G. Bazan
Abstract:
Aims: Neurodegeneration and behavioral dysfunction occur in patients with Alzheimer's Disease (AD), and as the disease progresses many patients develop cognitive impairment. The 5XFAD mouse model of AD is widely used to study AD pathogenesis and treatment. This study aimed to investigate the effect of maresin like 1 (MaR-L1) treatment on AD pathology using 5XFAD mice. Methods: We tested 12-month-old male 5XFAD mice and wild-type control mice treated with MaR-L1 in a battery of behavioral tasks: the open field test, beam walking test, clasping test, inverted grid test, acetone test, marble burying test, elevated plus maze test, cross maze test, and novel object recognition test. We also studied neuronal loss, amyloid β burden, and inflammation in the brains of 5XFAD mice using immunohistology and Western blotting. Results: MaR-L1 treatment improved cognitive function in 5XFAD mice. MaR-L1-treated mice showed decreased anxiety-like behavior in the open field and marble burying tests and increased muscular strength in the beam walking, clasping, and inverted grid tests. Cognitive function was improved in MaR-L1-treated 5XFAD mice in the novel object recognition test. MaR-L1 prevented neuronal loss and aberrant inflammation. Conclusion: Our findings suggest that behavioral abnormalities were normalized by the administration of MaR-L1, indicating a neuroprotective role of MaR-L1 in AD, and that MaR-L1 treatment is able to prevent or ameliorate neuronal loss and aberrant inflammation. Further experiments using other AD models are warranted to validate these results.
Keywords: Alzheimer's disease, motor and cognitive behavior, 5XFAD mice, Maresin Like 1, microglial cell, astrocyte, neurodegeneration, inflammation, resolution of inflammation
46 Effect of Repellent Coatings, Aerosol Protective Liners, and Lamination on the Properties of Chemical/Biological Protective Textiles
Authors: Natalie Pomerantz, Nicholas Dugan, Molly Richards, Walter Zukas
Abstract:
The primary research question to be answered for Chemical/Biological (CB) protective clothing is how to protect wearers from a range of chemical and biological threats in liquid, vapor, and aerosol form while reducing the thermal burden. Currently, CB protective garments are hot and heavy, and wearers are limited to short work times in order to prevent heat injury. This study demonstrates how to incorporate different levels of protection on a material level and modify fabric composites such that the thermal burden is reduced to such an extent that it approaches that of a standard duty uniform with no CB protection. CB protective materials are usually comprised of several fabric layers: a cover fabric with a liquid repellent coating, a protective layer comprised of a carbon-based sorptive material or semi-permeable membrane, and a comfort next-to-skin liner. In order to reduce thermal burden, all of these layers were laminated together to form one fabric composite with no insulative air gap between layers. However, the elimination of the air gap also reduced the CB protection of the fabric composite. In order to increase protection in the laminated composite, different nonwoven aerosol protective liners were added, and a super repellent coating was applied to the cover fabric prior to lamination. Different adhesive patterns were investigated to determine the durability of the laminate with the super repellent coating and the effect on air permeation. After evaluating the thermal, textile, and protective properties of the iterations of these fabric composites, it was found that the thermal burden of these materials was greatly reduced by decreasing the thermal resistance through the elimination of the air gap between layers. While the level of protection was reduced in laminate composites, the addition of a super repellent coating increased protection against low-volatility agents without impacting thermal burden. 
Similarly, the addition of an aerosol protective liner increased protection without reducing water vapor transport, depending on the nonwoven used; however, the air permeability was significantly decreased. The balance of all these properties and an exploration of the trade space between thermal burden and protection will be discussed.
Keywords: aerosol protection, CBRNe protection, lamination, nonwovens, repellent coatings, thermal burden
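The thermal-resistance argument above is simply series addition: in steady-state conduction through stacked layers, resistances add like electrical resistors, so laminating out the still-air gaps removes their resistance terms from the sum. The layer values below are placeholders, not measured data from the study.

```python
def total_thermal_resistance(layer_resistances):
    """Steady-state conduction through a stack of fabric layers:
    thermal resistances (m^2*K/W) add in series."""
    return sum(layer_resistances)

# Placeholder values (m^2*K/W): cover fabric, protective layer, liner,
# plus two still-air gaps in the conventional (non-laminated) assembly.
layered = total_thermal_resistance([0.02, 0.05, 0.02, 0.04, 0.04])
laminated = total_thermal_resistance([0.02, 0.05, 0.02])  # gaps eliminated
```

With these placeholder numbers the laminate's resistance drops from 0.17 to 0.09 m^2*K/W, which is the mechanism behind the reduced thermal burden (at the cost of the protection the air gap provided).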
45 A Double Ended AC Series Arc Fault Location Algorithm Based on Currents Estimation and a Fault Map Trace Generation
Authors: Edwin Calderon-Mendoza, Patrick Schweitzer, Serge Weber
Abstract:
Series arc faults appear frequently and unpredictably in low voltage distribution systems. Many methods have been developed to detect this type of fault, and commercial protection devices such as the AFCI (arc fault circuit interrupter) have been used successfully in electrical networks to prevent damage and catastrophic incidents like fires. However, these devices do not allow series arc faults to be located on the line in operating mode. This paper presents a location algorithm for series arc faults in a low-voltage indoor power line in an AC 230 V-50 Hz home network. The method is validated through simulations in MATLAB. The fault location method uses the electrical parameters (resistance, inductance, capacitance, and conductance) of a 49 m indoor power line. The mathematical model of a series arc fault is based on the analysis of the V-I characteristics of the arc and consists basically of two antiparallel diodes and DC voltage sources. In a first step, the arc fault model is inserted at several different positions along the line, which is modeled using lumped parameters. At both ends of the line, currents and voltages are recorded for each arc fault generated at a different distance. In a second step, a fault map trace is created using signature coefficients obtained from the Kirchhoff equations, which allow a virtual decoupling of the line’s mutual capacitance. Each signature coefficient, obtained from the subtraction of estimated currents, is calculated taking into account the Discrete Fourier Transform of the currents and voltages and also the fault distance value. These parameters are then substituted into the Kirchhoff equations. In a third step, the same procedure used to calculate the signature coefficients is employed, but this time considering hypothetical fault distances at which the fault could appear. In this step the fault distance is unknown. 
The iterative calculation from the Kirchhoff equations, considering stepped variations of the fault distance, yields a curve with a linear trend. Finally, the fault distance is estimated at the intersection of the two curves obtained in steps 2 and 3. The series arc fault model is validated by comparing the simulated current with real recorded currents. The model of the complete circuit is obtained for a 49 m line with a resistive load, and 11 different arc fault positions are considered for the map trace generation. The complete simulation demonstrates the performance of the method, and the perspectives of the work will be presented.
Keywords: indoor power line, fault location, fault map trace, series arc fault
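The final step above, reading the fault distance off the intersection of two near-linear signature-coefficient curves, can be sketched as follows. The coefficient curves here are synthetic placeholders; in the paper they come from the transformed currents and voltages substituted into the Kirchhoff equations.

```python
import numpy as np

def intersect(x, y1, y2):
    """Fit a straight line to each curve and return the x-coordinate
    where the two fitted lines cross."""
    m1, c1 = np.polyfit(x, y1, 1)
    m2, c2 = np.polyfit(x, y2, 1)
    return (c2 - c1) / (m1 - m2)

# Hypothetical fault distances stepped along a 49 m line (step 3).
d = np.linspace(0.0, 49.0, 50)
curve_map_trace = 2.0 * d + 3.0      # step-2 curve (placeholder coefficients)
curve_hypothetical = -1.0 * d + 90.0 # step-3 curve (placeholder coefficients)
fault_distance = intersect(d, curve_map_trace, curve_hypothetical)
```

With these placeholder lines the intersection falls at 29 m; in the method proper, that x-coordinate is the estimated distance from the line end to the series arc fault.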
44 Utilization of Standard Paediatric Observation Chart to Evaluate Infants under Six Months Presenting with Non-Specific Complaints
Authors: Michael Zhang, Nicholas Marriage, Valerie Astle, Marie-Louise Ratican, Jonathan Ash, Haddijatou Hughes
Abstract:
Objective: Young infants are often brought to the Emergency Department (ED) with a variety of complaints, some of which are non-specific and present a diagnostic challenge to the attending clinician. Whilst invasive investigations such as blood tests and lumbar puncture are necessary in some cases to exclude serious infections, basic clinical tools, in addition to a thorough clinical history, can be useful in assessing the risk of serious conditions in these young infants. This study aimed to examine the utilization of one such clinical tool. Methods: This retrospective observational study examined the medical records of infants under 6 months presenting to a mixed urban ED between January 2013 and December 2014. Infants deemed by the emergency clinicians to have non-specific complaints or diagnoses were selected for analysis; those with clear systemic diagnoses were excluded. Among all relevant clinical information and investigation results, utilization of the Standard Paediatric Observation Chart (SPOC) was particularly scrutinized in these medical records. This chart, developed by expert clinicians in the local health department, categorizes important clinical signs into colour-coded zones as a visual cue for the serious implications of some abnormalities. An infant is regarded as SPOC positive when fulfilling one red-zone or two yellow-zone criteria, prompting the attending clinician to investigate and treat for potential serious conditions accordingly. Results: Eight hundred and thirty-five infants met the inclusion criteria for this project. Those admitted to the hospital for further management were more likely to be SPOC positive than the discharged infants (odds ratio: 12.26, 95% CI: 8.04-18.69). Similarly, the sepsis alert criteria on the SPOC were positive in a higher percentage of patients with serious infections (56.52%) than in those with mild conditions (15.89%) (p < 0.001). 
The SPOC sepsis criteria had a sensitivity of 56.5% (95% CI: 47.0% - 65.7%) and a moderate specificity of 84.1% (95% CI: 80.8% - 87.0%) for identifying serious infections. Applied to this infant population, with a 17.4% prevalence of serious infection, the positive predictive value was only 42.8% (95% CI: 36.9% - 49.0%), whereas the negative predictive value was high at 90.2% (95% CI: 88.1% - 91.9%). Conclusions: The Standard Paediatric Observation Chart is a useful clinical tool for helping to identify and manage sick young infants in the ED effectively.
Keywords: clinical tool, infants, non-specific complaints, Standard Paediatric Observation Chart
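The reported predictive values follow from Bayes' theorem applied to the stated sensitivity, specificity, and 17.4% prevalence. A minimal sketch reproducing the abstract's point estimates (function names are illustrative, not from the study):

```python
def ppv(sensitivity, specificity, prevalence):
    """Positive predictive value: true positives over all positives."""
    tp = sensitivity * prevalence
    fp = (1 - specificity) * (1 - prevalence)
    return tp / (tp + fp)

def npv(sensitivity, specificity, prevalence):
    """Negative predictive value: true negatives over all negatives."""
    tn = specificity * (1 - prevalence)
    fn = (1 - sensitivity) * prevalence
    return tn / (tn + fn)

# Point estimates reported for the SPOC sepsis criteria
se, sp, prev = 0.565, 0.841, 0.174
print(round(ppv(se, sp, prev), 3))  # 0.428, i.e. 42.8%
print(round(npv(se, sp, prev), 3))  # 0.902, i.e. 90.2%
```

This makes the conclusion concrete: at this prevalence, a positive SPOC sepsis screen is wrong more often than right, while a negative screen is reassuring.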
Procedia PDF Downloads 252
43 Knowledge Management Barriers: A Statistical Study of Hardware Development Engineering Teams within Restricted Environments
Authors: Nicholas S. Norbert Jr., John E. Bischoff, Christopher J. Willy
Abstract:
Knowledge Management (KM) is globally recognized as a crucial element in securing competitive advantage: it builds and maintains organizational memory, codifies and protects intellectual capital and business intelligence, and provides mechanisms for collaboration and innovation. KM frameworks and approaches have been developed that identify critical success factors for conducting KM across numerous industries, from scientific to business, and across organizational scales from small groups to large enterprises. However, engineering and technical teams operating within restricted environments are subject to unique barriers and KM challenges that cannot be directly treated using the approaches and tools prescribed for other industries. This research identifies barriers to conducting KM within Hardware Development Engineering (HDE) teams and statistically compares their significance against the four KM pillars of organization, technology, leadership, and learning. HDE teams suffer from restrictions on knowledge sharing (KS) due to classification of information (national security risks), customer proprietary restrictions (non-disclosure agreements covering designs), the types and complexity of the knowledge to be shared, and knowledge-seeker expertise. As KM has evolved, leveraging information technology (IT) and web-based tools and approaches from Web 1.0 to Enterprise 2.0, it may also leverage emergent tools and analytics, including expert locators and hybrid recommender systems, to enable KS across the barriers facing technical teams. The research will test hypotheses that statistically evaluate whether KM barriers for HDE teams affect the general set of expected benefits of a KM system identified through previous research. If correlations are identified, generalizations about success factors and approaches may also be garnered for HDE teams.
Expert elicitation will be conducted using an internet-hosted questionnaire delivered to a panel of experts including engineering managers, principal and lead engineers, senior systems engineers, and knowledge management experts. The questionnaire responses will be processed using analysis of variance (ANOVA) to identify and rank the statistically significant barriers facing HDE teams within the four KM pillars. Subsequently, KM approaches will be recommended for upholding the KM pillars within the restricted environments of HDE teams.
Keywords: engineering management, knowledge barriers, knowledge management, knowledge sharing
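The planned analysis compares between-group variance to within-group variance across expert panels. A minimal one-way ANOVA sketch with hypothetical 1-5 Likert ratings of a single barrier (the data and grouping are invented for illustration; in practice a library routine such as SciPy's f_oneway would be used):

```python
def one_way_anova_f(groups):
    """One-way ANOVA F statistic: ratio of between-group to
    within-group mean squares for a list of rating groups."""
    k = len(groups)
    n = sum(len(g) for g in groups)
    grand = sum(sum(g) for g in groups) / n
    ss_between = sum(len(g) * (sum(g) / len(g) - grand) ** 2 for g in groups)
    ss_within = sum((x - sum(g) / len(g)) ** 2 for g in groups for x in g)
    return (ss_between / (k - 1)) / (ss_within / (n - k))

# Hypothetical Likert ratings of one KM barrier from three expert panels
ratings = [[4, 5, 4, 5], [3, 3, 4, 3], [2, 3, 2, 3]]
f_stat = one_way_anova_f(ratings)  # large F suggests panels rate the barrier differently
```

The F statistic would then be compared against the F distribution with (k-1, n-k) degrees of freedom to decide significance and rank barriers.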
Procedia PDF Downloads 279
42 Noncovalent Antibody-Nanomaterial Conjugates: A Simple Approach to Produce Targeted Nanomedicines
Authors: Nicholas Fletcher, Zachary Houston, Yongmei Zhao, Christopher Howard, Kristofer Thurecht
Abstract:
One promising approach to enhancing nanomedicine therapeutic efficacy is to include a targeting agent, such as an antibody, to increase accumulation at the tumor site. However, the application of such targeted nanomedicines remains limited, in part due to the difficulty of conjugating biomolecules to synthetic nanomaterials. One approach recently developed to overcome this has been to engineer bispecific antibodies (BsAbs) with dual specificity: one portion binds methoxy polyethylene glycol (mPEG) epitopes present on synthetic nanomedicines, while the other binds molecular disease markers of interest. In this way, noncovalent complexes of a nanomedicine core, comprising a hyperbranched polymer (HBP) of primarily mPEG, decorated with targeting ligands can be produced by simple mixing. Further work in this area has demonstrated that such complexes targeting the breast cancer marker epidermal growth factor receptor (EGFR) show enhanced binding to tumor cells both in vitro and in vivo. Indeed, the enhanced accumulation at the tumor site resulted in improved therapeutic outcomes compared to untargeted nanomedicines and free chemotherapeutics. The current work on these BsAb-HBP conjugates focuses on further probing antibody-nanomaterial interactions and demonstrating broad applicability to a range of cancer types. Herein, BsAb-HBP materials targeted towards prostate-specific membrane antigen (PSMA) are reported, along with a study of their behavior in vivo using ⁸⁹Zr positron emission tomography (PET) in a dual-tumor prostate cancer xenograft model. In this model, mice bearing both PSMA+ and PSMA- tumors allow PET imaging to discriminate between nonspecific and targeted uptake in tumors and to better quantify the increased accumulation following BsAb conjugation. Also examined is the potential for formation of these targeted complexes in situ following injection of the individual components.
This approach aims to avoid the undesirable clearance of proteinaceous complexes upon injection, which limits the available therapeutic. Ultimately, these results demonstrate BsAb-functionalized nanomaterials as a powerful and versatile approach for producing targeted nanomedicines for a variety of cancers.
Keywords: bioengineering, cancer, nanomedicine, polymer chemistry
Procedia PDF Downloads 141
41 Impacts of Commercial Honeybees on Native Butterflies in High-Elevation Meadows in Utah, USA
Authors: Jacqueline Kunzelman, Val Anderson, Robert Johnson, Nicholas Anderson, Rebecca Bates
Abstract:
In an effort to protect honeybees from colony collapse disorder, beekeepers are filing for government permits to use natural lands as summer pasture for honeybees under the multiple-use management regime in the United States. Utilizing natural landscapes in high mountain ranges may help strengthen honeybee colonies, as these natural settings are generally devoid of the chemical pollutants and pesticides found in agricultural and urban settings. However, the introduction of a competitive species could greatly impact the native species occupying these landscapes. While honeybees and butterflies have different life histories, behavior, and foraging strategies, they compete for the same nectar resources. Few, if any, studies have focused on the potential population effects of commercial honeybees on native butterfly abundance and diversity. This study assesses this impact using a paired before-after control-impact (BACI) design. Over the course of two years, malaise trap samples were collected every week during the flowering season in two similar areas separated by 11 kilometers, each containing nine malaise trap sites for replication. In the first year, samples were taken to analyze and establish trends within the pollinating communities. In the second year, honeybees were introduced to only one of the two areas, and a change in trends between the two areas was assessed. Contrary to the original hypothesis, the resulting observation was an overall significant increase in mean butterfly abundance in the impact area after honeybees were introduced, while the control area remained relatively stable. This overall increase in abundance over the season can be attributed to an increase in butterflies during the first and second periods of data collection, when populations were near their peak.
Several potential explanations are: 1) honeybees are deterring a natural predator or competitor of butterflies that previously limited population growth; 2) honeybees are consuming resources regularly used by butterflies, which may extend butterflies' foraging time and consequently their capture rates; 3) environmental factors such as the number of rainy days were inconsistent between the control and impact areas, biasing capture rates. This ongoing research will help determine the suitability of high mountain ranges for the summer pasturing of honeybees and the population impacts on many different pollinators.
Keywords: butterfly, competition, honeybee, pollinator
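The core BACI contrast is a difference-in-differences: the change in mean abundance at impact sites minus the change at control sites, so that a shared regional trend cancels out. A minimal sketch with hypothetical per-site butterfly counts (the numbers are invented; the real analysis would model the full weekly time series):

```python
def baci_effect(control_before, control_after, impact_before, impact_after):
    """BACI contrast: change at impact sites minus change at control
    sites. A positive value means abundance rose more (or fell less)
    where honeybees were introduced."""
    mean = lambda xs: sum(xs) / len(xs)
    return (mean(impact_after) - mean(impact_before)) - \
           (mean(control_after) - mean(control_before))

# Hypothetical mean weekly butterfly captures per malaise trap site
ctrl_before, ctrl_after = [12, 10, 11], [11, 12, 10]   # control area: stable
imp_before, imp_after = [13, 11, 12], [18, 17, 19]     # impact area: increase
effect = baci_effect(ctrl_before, ctrl_after, imp_before, imp_after)  # 6.0
```

In practice the contrast is tested as the area-by-period interaction term in an ANOVA or mixed model over the replicated trap sites.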
Procedia PDF Downloads 146
40 Long Term Survival after a First Transient Ischemic Attack in England: A Case-Control Study
Authors: Padma Chutoo, Elena Kulinskaya, Ilyas Bakbergenuly, Nicholas Steel, Dmitri Pchejetski
Abstract:
Transient ischaemic attacks (TIAs) are warning signs of future strokes. TIA patients are at increased risk of stroke and cardiovascular events after a first episode. Most studies on TIA have focused on the occurrence of these ancillary events after a TIA; long-term mortality after TIA has received only limited attention. We undertook this study to determine the long-term hazards of all-cause mortality following a first episode of TIA using anonymised electronic health records (EHRs). We conducted a retrospective case-control study using electronic primary health care records from The Health Improvement Network (THIN) database. Patients born in or before 1960, resident in England, with a first diagnosis of TIA between January 1986 and January 2017 were matched to three controls each on age, sex, and general medical practice. The primary outcome was all-cause mortality. The hazards of all-cause mortality were estimated using a time-varying Weibull-Cox survival model that included both scale and shape effects and a random frailty effect of GP practice. 20,633 cases and 58,634 controls were included. Cases aged 39 to 60 years at the first TIA event had the highest hazard ratio (HR) of mortality compared to matched controls (HR = 3.04, 95% CI: 2.91 - 3.18). The HRs for cases aged 61-70 years, 71-76 years, and 77+ years were 1.98 (1.55 - 2.30), 1.79 (1.20 - 2.07), and 1.52 (1.15 - 1.97), respectively, compared to matched controls. Aspirin provided long-term survival benefits to cases. Cases aged 39-60 years on aspirin had HRs of 0.93 (0.84 - 1.00), 0.90 (0.82 - 0.98), and 0.88 (0.80 - 0.96) at 5, 10, and 15 years, respectively, compared to cases in the same age group who were not on antiplatelets. Similar beneficial effects of aspirin were observed in the other age groups. There were no significant survival benefits with other antiplatelet options, and no survival benefits of antiplatelet drugs were observed in controls.
Our study highlights the excess long-term risk of death for TIA patients and cautions that TIA should not be treated as a benign condition. The study further suggests aspirin as a better option for secondary prevention in TIA patients than the clopidogrel recommended by NICE guidelines. Management of risk factors and treatment strategies remain important challenges in reducing the burden of disease.
Keywords: dual antiplatelet therapy (DAPT), general practice, multiple imputation, The Health Improvement Network (THIN), hazard ratio (HR), Weibull-Cox model
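To illustrate what a hazard ratio means under a Weibull baseline: a proportional-hazards effect multiplies the cumulative hazard, so survival satisfies S_case(t) = S_control(t)^HR. The sketch below uses the reported HR of 3.04 for the 39-60 age group with entirely hypothetical shape and scale parameters; the study's actual model is considerably richer (time-varying scale and shape effects plus a GP-practice frailty), which this deliberately omits:

```python
import math

def weibull_hazard(t, shape, scale):
    """Weibull hazard h(t) = (k/lam) * (t/lam)**(k-1)."""
    return (shape / scale) * (t / scale) ** (shape - 1)

def weibull_survival(t, shape, scale, hr=1.0):
    """Survival under a proportional-hazards multiplier hr:
    S(t) = exp(-hr * (t/lam)**k)."""
    return math.exp(-hr * (t / scale) ** shape)

# Hypothetical baseline (shape=1.2, scale=30 years) and the reported HR
s_control = weibull_survival(15, 1.2, 30.0)            # control survival at 15 years
s_case = weibull_survival(15, 1.2, 30.0, hr=3.04)      # TIA case, HR = 3.04
# Proportional hazards implies s_case == s_control ** 3.04
```

Fitting such a model to real data would typically use a survival library (e.g. lifelines' parametric fitters) rather than closed-form evaluation.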
Procedia PDF Downloads 149