Search results for: Nicholas Okpe
39 Mitigating Self-Regulation Issues in the Online Instruction of Math
Authors: Robert Vanderburg, Michael Cowling, Nicholas Gibson
Abstract:
Mathematics is one of the core subjects taught in the Australian K-12 education system and is considered an important component of future studies in areas such as engineering and technology. In addition, Australia has been a world leader in distance education owing to the vastness of its geographic landscape. Despite this, research on distance math instruction is still needed. Even though curriculum delivery has given way to online study, with a resultant push for computer-based (PC, tablet, smartphone) math instruction, much instruction still involves practice problems similar to those in the original curriculum packs, without allowing students to self-regulate their learning using the full interactive capabilities of these devices. Given this need, this paper addresses issues students face during online instruction. This study followed 32 students struggling with mathematics who were enrolled in an online math tutorial. The study used a case study design to understand some of the blockades hindering the students' success. Data were collected by tracking students' practice and quizzes, tracking engagement with the site, recording one-on-one tutorials, and interviewing the students. Results revealed that when students face cognitively straining tasks in an online instructional setting, the first thing to dissipate is their ability to self-regulate. The results also revealed that instructors could ameliorate the situation, and they provided useful data on strategies for designing future online tasks. Specifically, instructors could use cognitive dissonance strategies to reduce the cognitive drain of online tasks. They could segment the instruction process to reduce the cognitive demands of the tasks and provide in-depth self-regulatory training, freeing mental capacity for the mathematics content.
Finally, instructors could make specific scheduling and assignment-structure changes to reduce the number of student-centered self-regulatory tasks in the class. These findings are discussed in more detail and summarized in a framework that can be used for future work.
Keywords: digital education, distance education, mathematics education, self-regulation
Procedia PDF Downloads 135
38 Periplasmic Expression of Anti-RoxP Antibody Fragments in Escherichia coli
Authors: Caspar S. Carson, Gabriel W. Prather, Nicholas E. Wong, Jeffery R. Anton, William H. McCoy
Abstract:
Cutibacterium acnes is a commensal bacterium found on human skin that has been linked to acne. C. acnes can also act as an opportunistic pathogen when it infiltrates the body during surgery, causing dangerous infections of medical implants, such as shoulder replacements, that can lead to life-threatening blood infections. Compounding this issue, resistance of C. acnes to many antibiotics has become an increasing problem worldwide, creating a need for new forms of treatment. C. acnes expresses the protein RoxP, which it requires to colonize human skin; the function of this protein, however, is not yet understood. Inhibition of RoxP function might therefore be an effective treatment for C. acnes infections. To develop such reagents, the McCoy Laboratory generated four unique anti-RoxP antibodies, and preliminary studies in the McCoy Lab have established that each antibody binds a distinct site on RoxP. To assess the potential of these antibodies as therapeutics, it is necessary to characterize their epitopes and evaluate them in assays of their ability to inhibit RoxP-dependent C. acnes growth. To provide material for these studies, an antibody expression construct, Fv-clasp(v2), was adapted to encode the anti-RoxP antibody sequences. The authors hypothesize that this expression strategy can produce sufficient amounts of >95% pure antibody fragments for further characterization of these antibodies. Four anti-RoxP Fv-clasp(v2) expression constructs (pET vector-based) were transformed into E. coli BL21-Gold(DE3) cells, and a small-scale expression and purification trial was performed for each construct to evaluate yield and purity. Successful expression and purification of these antibody constructs will allow their use in structural studies such as protein crystallography and cryogenic electron microscopy.
Such studies would help define the antibody binding sites on RoxP, which could then be leveraged to develop methods of treating C. acnes infection through RoxP inhibition.
Keywords: structural biology, protein expression, infectious disease, antibody, therapeutics, E. coli
Procedia PDF Downloads 60
37 Identification of Candidate Congenital Heart Defect Biomarkers by Applying a Random Forest Approach on DNA Methylation Data
Authors: Kan Yu, Khui Hung Lee, Eben Afrifa-Yamoah, Jing Guo, Katrina Harrison, Jack Goldblatt, Nicholas Pachter, Jitian Xiao, Guicheng Brad Zhang
Abstract:
Background and Significance of the Study: Congenital heart defects (CHDs) are the most common malformations at birth and one of the leading causes of infant death. Although the exact etiology remains a significant challenge, epigenetic modifications such as DNA methylation are thought to contribute to the pathogenesis of CHDs. At present, no DNA methylation biomarkers are used for early detection of CHDs, and existing diagnostic techniques are time-consuming, costly, and applicable only after an infant is born. The present study employed a machine learning technique to analyse genome-wide methylation data in children with and without CHDs, with the aim of finding methylation biomarkers for CHDs. Methods: The Illumina Human Methylation EPIC BeadChip was used to screen the genome-wide DNA methylation profiles of 24 infants diagnosed with CHDs and 24 healthy infants without CHDs. Primary pre-processing was conducted using the RnBeads and limma packages. The 600 genes with the lowest p-values were selected and further investigated using a random forest approach. ROC curves were used to analyse the sensitivity and specificity of each biomarker in both training and test sample sets. The roles in molecular processes of selected genes with high sensitivity and specificity were then assessed. Major Findings of the Study: Three genes (MIR663, FGF3, and FAM64A) were identified from both training and validation data by random forests, with an average sensitivity of 85% and specificity of 95%. GO analyses of the top 600 genes showed that these putative differentially methylated genes were primarily associated with regulation of lipid metabolic processes, protein-containing complex localization, and the Notch signalling pathway.
The present findings highlight that aberrant DNA methylation may play a significant role in the pathogenesis of congenital heart defects.
Keywords: biomarker, congenital heart defects, DNA methylation, random forest
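The pipeline described above (screen probes by univariate p-value, fit a random forest on the top genes, score with ROC curves on held-out data) can be sketched as follows. This is a minimal illustration on synthetic data: the probe counts, effect sizes, and model parameters are invented stand-ins, not the study's EPIC array data or its actual limma/RnBeads workflow.

```python
# Sketch: univariate screen on training data, random forest on the
# selected features, ROC AUC on a held-out set. Synthetic data stands
# in for the 48-infant methylation matrix.
import numpy as np
from scipy.stats import ttest_ind
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n_cases, n_controls, n_probes = 24, 24, 2000
X = rng.normal(size=(n_cases + n_controls, n_probes))
y = np.array([1] * n_cases + [0] * n_controls)
X[y == 1, :50] += 1.0          # 50 truly differentially methylated probes

X_tr, X_te, y_tr, y_te = train_test_split(
    X, y, test_size=0.25, stratify=y, random_state=0)

# Univariate screen on the training set only (the study kept the 600
# lowest-p genes; 600 is used here as well).
_, pvals = ttest_ind(X_tr[y_tr == 1], X_tr[y_tr == 0], axis=0)
top = np.argsort(pvals)[:600]

clf = RandomForestClassifier(n_estimators=500, random_state=0)
clf.fit(X_tr[:, top], y_tr)
auc = roc_auc_score(y_te, clf.predict_proba(X_te[:, top])[:, 1])
print(round(auc, 3))
```

Screening on the training split only, as above, avoids the selection bias that would arise from choosing the top 600 genes using the full dataset before cross-validation.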
Procedia PDF Downloads 157
36 Autophagy Defects That Modify Human Immune Cell Metabolism and Promote Aging-Associated Inflammation
Authors: Grace McCambridge, Alanna Keady, Madhur Agrawal, Dequina Nicholas Alvarado, Barbara Nikolajczyk, Leena Panneerseelan-Bharath
Abstract:
Age is a non-modifiable risk factor for the inflammation that underlies pathologies such as type 2 diabetes mellitus (T2DM). Inflammation, as indicated by circulating cytokines, rises with aging, but the mechanisms that promote this 'inflammaging' remain poorly defined. Furthermore, downstream consequences of inflammaging, including the development of an inflammatory profile that predicts comorbidities like T2DM, remain speculative. We tested the possibility that natural aging-associated changes in autophagy, a process that is compromised in both aging and T2DM, regulate inflammatory profiles in older subjects. Our data showed that circulating CD4⁺ T cells from older compared to younger subjects have (i) defects in autophagy; (ii) higher mitochondrial accumulation; (iii) a failure to shift metabolically from oxidative phosphorylation to anaerobic glycolysis upon αCD3/CD28 activation; (iv) more reactive oxygen species (ROS) accumulation; and (v) a cytokine profile that recapitulates the Th17 profile that predicts T2DM. ROS scavenging in cells from older subjects restored mitochondrial mass and membrane potential (indicators of improved autophagy) and reduced Th17 cytokines to the amounts made by T cells from younger subjects. Knock-down of the autophagy protein Atg3 in T cells from younger subjects increased mitochondrial accumulation and Th17 cytokines. To begin translating these findings to clinical practice, we showed that physiological concentrations of the diabetes drug metformin (100 µM) added in vitro enhanced autophagy, prevented mitochondrial and ROS accumulation, increased anaerobic glycolysis, and decreased Th17 cytokines in activated CD4⁺ T cells from older subjects. Metformin therefore improves autophagy and multiple downstream pro-inflammatory mechanisms in CD4⁺ T cells from older subjects.
We conclude that improving autophagy attenuates the development of a T2DM-predictive Th17 profile in aging and thus holds promise for delaying or preventing aging-associated metabolic decline.
Keywords: autophagy, mitochondrial turnover, ROS, glycolysis
Procedia PDF Downloads 163
35 Adsorption and Desorption Behavior of Ionic and Nonionic Surfactants on Polymer Surfaces
Authors: Giulia Magi Meconi, Nicholas Ballard, José M. Asua, Ronen Zangi
Abstract:
Experimental and computational studies are combined to elucidate the adsorption properties of ionic and nonionic surfactants on hydrophobic polymer surfaces such as polystyrene. To represent these two types of surfactants, sodium dodecyl sulfate and poly(ethylene glycol)-block-poly(ethylene), both commonly used in emulsion polymerization, were chosen. By applying quartz crystal microbalance with dissipation monitoring, it is found that at low surfactant concentrations it is easier (as measured by rate) to desorb ionic surfactants than nonionic surfactants. From molecular dynamics simulations, the effective attractive force of the nonionic surfactants to the surface increases as their concentration decreases, whereas the ionic surfactant exhibits a mildly opposite trend. The contrasting behavior of ionic and nonionic surfactants rests critically on two observations from the simulations. The first is that there is a large degree of interweaving between head and tail groups in the adsorbed layer formed by the nonionic surfactant (PEO/PE systems). The second is that water molecules penetrate this layer. In the disordered layer these nonionic surfactants form at the surface, only oxygens of the head groups present at the interface with the water phase, or oxygens next to the penetrating waters, can form hydrogen bonds. Oxygens inside this layer lose this favorable energy, with a magnitude that increases with the surfactant density at the interface. This reduced stability of the surfactants diminishes their driving force for adsorption. All of this is shown to accord with experimental results on the dynamics of surfactant desorption. Ionic surfactants assemble into an ordered structure, and their attraction to the surface was even slightly augmented at higher surfactant concentration, in agreement with the experimentally determined adsorption isotherm.
The reason the two types of surfactants behave differently is that the ionic surfactant has a small, strongly hydrophilic head group, whereas the head groups of the nonionic surfactants are large and only weakly attracted to water.
Keywords: emulsion polymerization process, molecular dynamics simulations, polymer surface, surfactant adsorption
Procedia PDF Downloads 341
34 On the Survival of Individuals with Type 2 Diabetes Mellitus in the United Kingdom: A Retrospective Case-Control Study
Authors: Njabulo Ncube, Elena Kulinskaya, Nicholas Steel, Dmitry Pshezhetskiy
Abstract:
Life expectancy in the United Kingdom (UK) has been near constant since 2010, particularly for individuals aged 65 years and older; this trend has also been noted in several other countries. The slowdown in the increase of life expectancy has been concurrent with an increase in the number of deaths caused by non-communicable diseases. Of particular concern is the worldwide exponential increase in the number of diabetes-related deaths. Previous studies have reported increased mortality hazards among diabetics compared to non-diabetics, and differing effects of antidiabetic drugs on mortality hazards. This study aimed to estimate the all-cause mortality hazards and related life expectancies among type 2 diabetes (T2DM) patients in the UK using a time-variant Gompertz-Cox model with frailty. The study also aimed to understand the major causes of the change in life expectancy growth in the last decade. A total of 221,182 individuals (30.8% T2DM, 57.6% male) aged 50 years and above, born between 1930 and 1960 inclusive and diagnosed between 2000 and 2016, were selected from The Health Improvement Network (THIN) database of UK primary care data and followed up to 31 December 2016. About 13.4% of participants died during the follow-up period. The overall all-cause mortality hazard ratio of T2DM patients compared to non-diabetic controls was 1.467 (1.381-1.558) when diagnosed at 50 to 59 years and 1.38 (1.307-1.457) when diagnosed at 60 to 74 years. The estimated life expectancies of T2DM individuals without further comorbidities diagnosed at age 60 were 2.43 (1930-1939 birth cohort), 2.53 (1940-1949 birth cohort), and 3.28 (1950-1960 birth cohort) years less than those of non-diabetic controls. However, the 1950-1960 birth cohort had a steeper hazard function than the 1940-1949 birth cohort for both T2DM and non-diabetic individuals. In conclusion, mortality hazards for people with T2DM continue to be higher than for non-diabetics.
The steeper mortality hazard slope for the 1950-1960 birth cohort may point to a sub-population contributing to the slowdown in life expectancy growth.
Keywords: T2DM, Gompertz-Cox model with frailty, all-cause mortality, life expectancy
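The Gompertz baseline hazard underlying the model above, combined with a proportional effect for T2DM, can be illustrated numerically. The baseline parameters below are invented for the sketch (they are not fitted to THIN data); the hazard ratio 1.467 is the one reported in the abstract for diagnosis at 50-59 years.

```python
import math

def gompertz_survival(t, a, b):
    """Survival under a Gompertz hazard h(t) = a * exp(b * t):
    S(t) = exp(-(a/b) * (exp(b*t) - 1))."""
    return math.exp(-(a / b) * (math.exp(b * t) - 1.0))

# Illustrative baseline parameters (assumed, not estimated from data).
a, b = 1e-4, 0.1          # baseline hazard level and aging rate
hr_t2dm = 1.467           # reported HR for T2DM diagnosed at 50-59 years

t = 60.0                  # follow-up horizon in this toy example
s_control = gompertz_survival(t, a, b)
# Under proportional hazards, S_case(t) = S_control(t) ** HR.
s_case = s_control ** hr_t2dm
print(round(s_control, 4), round(s_case, 4))
```

The `S_case = S_control ** HR` identity follows from the proportional-hazards assumption, since multiplying the hazard by a constant raises the survival function to that power.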
Procedia PDF Downloads 119
33 Maresin Like 1 Treatment: Curbing the Pathogenesis of Behavioral Dysfunction and Neurodegeneration in Alzheimer's Disease Mouse Model
Authors: Yan Lu, Song Hong, Janakiraman Udaiyappan, Aarti Nagayach, Quoc-Viet A. Duong, Masao Morita, Shun Saito, Yuichi Kobayashi, Yuhai Zhao, Hongying Peng, Nicholas B. Pham, Walter J. Lukiw, Christopher A. Vuong, Nicolas G. Bazan
Abstract:
Aims: Neurodegeneration and behavioral dysfunction occur in patients with Alzheimer's disease (AD), and as the disease progresses many patients develop cognitive impairment. The 5XFAD mouse model of AD is widely used to study AD pathogenesis and treatment. This study aimed to investigate the effect of maresin like 1 (MaR-L1) treatment on AD pathology using 5XFAD mice. Methods: We tested 12-month-old male 5XFAD mice and wild-type control mice treated with MaR-L1 in a battery of behavioral tasks: the open field test, beam walking test, clasping test, inverted grid test, acetone test, marble burying test, elevated plus maze test, cross maze test, and novel object recognition test. We also studied neuronal loss, amyloid β burden, and inflammation in the brains of 5XFAD mice using immunohistology and Western blotting. Results: MaR-L1 treatment improved cognitive function in 5XFAD mice, as shown in the novel object recognition test. MaR-L1-treated mice showed decreased anxiety-like behavior in the open field and marble burying tests, and increased muscular strength in the beam walking, clasping, and inverted grid tests. MaR-L1 also prevented neuronal loss and aberrant inflammation. Conclusion: Our findings suggest that behavioral abnormalities were normalized by the administration of MaR-L1 and support a neuroprotective role for MaR-L1 in AD. They also indicate that MaR-L1 treatment can prevent and/or ameliorate neuronal loss and aberrant inflammation. Further experiments using other AD models are warranted to validate these results.
Keywords: Alzheimer's disease, motor and cognitive behavior, 5XFAD mice, Maresin Like 1, microglial cell, astrocyte, neurodegeneration, inflammation, resolution of inflammation
Procedia PDF Downloads 178
32 Effect of Repellent Coatings, Aerosol Protective Liners, and Lamination on the Properties of Chemical/Biological Protective Textiles
Authors: Natalie Pomerantz, Nicholas Dugan, Molly Richards, Walter Zukas
Abstract:
The primary research question for Chemical/Biological (CB) protective clothing is how to protect wearers from a range of chemical and biological threats in liquid, vapor, and aerosol form while reducing thermal burden. Currently, CB protective garments are hot and heavy, and wearers are limited to short work times in order to prevent heat injury. This study demonstrates how to incorporate different levels of protection at the material level and modify fabric composites such that the thermal burden is reduced to the point that it approaches that of a standard duty uniform with no CB protection. CB protective materials usually comprise several fabric layers: a cover fabric with a liquid-repellent coating, a protective layer of a carbon-based sorptive material or semi-permeable membrane, and a next-to-skin comfort liner. To reduce thermal burden, all of these layers were laminated together to form one fabric composite with no insulative air gap between layers. However, eliminating the air gap also reduced the CB protection of the fabric composite. To increase protection in the laminated composite, different nonwoven aerosol protective liners were added, and a super-repellent coating was applied to the cover fabric prior to lamination. Different adhesive patterns were investigated to determine the durability of the laminate with the super-repellent coating and the effect on air permeation. After evaluating the thermal, textile, and protective properties of iterations of these fabric composites, it was found that thermal burden was greatly reduced by the decrease in thermal resistance that followed from eliminating the air gap between layers. While the level of protection was reduced in laminate composites, the addition of a super-repellent coating increased protection against low-volatility agents without impacting thermal burden.
Similarly, the addition of an aerosol protective liner increased protection without reducing water vapor transport, depending on the nonwoven used; however, air permeability was significantly decreased. The balance of all these properties and an exploration of the trade space between thermal burden and protection will be discussed.
Keywords: aerosol protection, CBRNe protection, lamination, nonwovens, repellent coatings, thermal burden
Procedia PDF Downloads 362
31 Utilization of Standard Paediatric Observation Chart to Evaluate Infants under Six Months Presenting with Non-Specific Complaints
Authors: Michael Zhang, Nicholas Marriage, Valerie Astle, Marie-Louise Ratican, Jonathan Ash, Haddijatou Hughes
Abstract:
Objective: Young infants are often brought to the Emergency Department (ED) with a variety of complaints, some of which are non-specific and present a diagnostic challenge to the attending clinician. Whilst invasive investigations such as blood tests and lumbar puncture are necessary in some cases to exclude serious infections, basic clinical tools, in addition to a thorough clinical history, can be useful for assessing the risk of serious conditions in these young infants. This study aimed to examine the utilization of one such clinical tool. Methods: This retrospective observational study examined the medical records of infants under 6 months presenting to a mixed urban ED between January 2013 and December 2014. Infants deemed by the emergency clinicians to have non-specific complaints or diagnoses were selected for analysis; those with clear systemic diagnoses were excluded. Among all relevant clinical information and investigation results, utilization of the Standard Paediatric Observation Chart (SPOC) was particularly scrutinized in these medical records. This chart, developed by expert clinicians in the local health department, categorizes important clinical signs into color-coded zones as a visual cue to the serious implications of some abnormalities. An infant is regarded as SPOC-positive when fulfilling one red-zone or two yellow-zone criteria, prompting the attending clinician to investigate and treat for potential serious conditions accordingly. Results: Eight hundred and thirty-five infants met the inclusion criteria for this project. Those admitted to the hospital for further management were more likely to be SPOC-positive than the discharged infants (odds ratio: 12.26, 95% CI: 8.04-18.69). Similarly, the sepsis alert criteria on the SPOC were positive in a higher percentage of patients with serious infections (56.52%) than in those with mild conditions (15.89%) (p < 0.001).
The SPOC sepsis criteria had a sensitivity of 56.5% (95% CI: 47.0%-65.7%) and a moderate specificity of 84.1% (95% CI: 80.8%-87.0%) for identifying serious infections. Applied to this infant population, with a 17.4% prevalence of serious infection, the positive predictive value was only 42.8% (95% CI: 36.9%-49.0%); however, the negative predictive value was high at 90.2% (95% CI: 88.1%-91.9%). Conclusions: The Standard Paediatric Observation Chart is a useful clinical tool for identifying and managing sick young infants in the ED.
Keywords: clinical tool, infants, non-specific complaints, Standard Paediatric Observation Chart
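The predictive values quoted above follow directly from sensitivity, specificity, and prevalence via Bayes' rule; the short check below reproduces the reported 42.8% PPV and 90.2% NPV from the abstract's own point estimates.

```python
# Reproduce the abstract's PPV/NPV from sensitivity, specificity,
# and prevalence using Bayes' rule.
sens = 0.565   # reported sensitivity of the SPOC sepsis criteria
spec = 0.841   # reported specificity
prev = 0.174   # prevalence of serious infection in this cohort

# PPV = P(disease | test+), NPV = P(no disease | test-)
ppv = sens * prev / (sens * prev + (1 - spec) * (1 - prev))
npv = spec * (1 - prev) / (spec * (1 - prev) + (1 - sens) * prev)
print(round(ppv, 3), round(npv, 3))
```

This arithmetic also makes the clinical point explicit: with a prevalence of 17.4%, even a specific test yields a modest PPV, while the NPV stays high, so the chart is better at ruling out serious infection than ruling it in.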
Procedia PDF Downloads 252
30 Knowledge Management Barriers: A Statistical Study of Hardware Development Engineering Teams within Restricted Environments
Authors: Nicholas S. Norbert Jr., John E. Bischoff, Christopher J. Willy
Abstract:
Knowledge Management (KM) is globally recognized as a crucial element in securing competitive advantage: it builds and maintains organizational memory, codifies and protects intellectual capital and business intelligence, and provides mechanisms for collaboration and innovation. KM frameworks and approaches have been developed that identify critical success factors for conducting KM in numerous industries, from scientific to business, and for organizations ranging from small groups to large enterprises. However, engineering and technical teams operating within restricted environments are subject to unique barriers and KM challenges that cannot be directly treated using the approaches and tools prescribed for other industries. This research identifies barriers to conducting KM within Hardware Development Engineering (HDE) teams and statistically compares their significance across the four KM pillars of organization, technology, leadership, and learning. HDE teams suffer from restrictions on knowledge sharing (KS) due to classification of information (national security risks), customer proprietary restrictions (non-disclosure agreements covering designs), the types and complexity of the knowledge to be shared, and knowledge-seeker expertise. As KM has evolved, leveraging information technology (IT) and web-based tools and approaches from Web 1.0 to Enterprise 2.0, it may also leverage emergent tools and analytics, including expert locators and hybrid recommender systems, to enable KS across the barriers facing technical teams. The research will statistically test the hypothesis that KM barriers for HDE teams affect the general set of expected benefits of a KM system identified in previous research. If correlations are identified, generalized success factors and approaches may also be derived for HDE teams.
Expert elicitation will be conducted using an internet-hosted questionnaire delivered to a panel of experts including engineering managers, principal and lead engineers, senior systems engineers, and knowledge management experts. Questionnaire responses will be processed using analysis of variance (ANOVA) to identify and rank the statistically significant barriers of HDE teams within the four KM pillars. Subsequently, KM approaches will be recommended for upholding the KM pillars within the restricted environments of HDE teams.
Keywords: engineering management, knowledge barriers, knowledge management, knowledge sharing
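The planned ANOVA step can be sketched as follows. The Likert-style ratings below are fabricated placeholders for three hypothetical respondent groups rating the severity of one KM barrier; they are not real survey data, and the study's actual design may compare barriers rather than respondent groups.

```python
# One-way ANOVA across three hypothetical respondent groups rating the
# severity of a single KM barrier on a 1-5 Likert scale.
from scipy.stats import f_oneway

managers   = [1, 1, 2, 2, 1]   # invented ratings
engineers  = [4, 5, 4, 5, 5]
km_experts = [3, 3, 2, 3, 3]

f_stat, p_value = f_oneway(managers, engineers, km_experts)
print(round(f_stat, 2), p_value < 0.05)
```

A significant F statistic here only indicates that at least one group mean differs; ranking the barriers, as the abstract proposes, would then require post-hoc comparisons across the barriers within each KM pillar.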
Procedia PDF Downloads 279
29 Noncovalent Antibody-Nanomaterial Conjugates: A Simple Approach to Produce Targeted Nanomedicines
Authors: Nicholas Fletcher, Zachary Houston, Yongmei Zhao, Christopher Howard, Kristofer Thurecht
Abstract:
One promising approach to enhancing nanomedicine therapeutic efficacy is to include a targeting agent, such as an antibody, to increase accumulation at the tumor site. However, the application of such targeted nanomedicines remains limited, in part due to the difficulties involved in conjugating biomolecules to synthetic nanomaterials. One approach recently developed to overcome this has been to engineer bispecific antibodies (BsAbs) with dual specificity: one portion binds methoxy polyethylene glycol (mPEG) epitopes present on synthetic nanomedicines, while the other binds molecular disease markers of interest. In this way, noncovalent complexes of a nanomedicine core, comprising a hyperbranched polymer (HBP) of primarily mPEG, decorated with targeting ligands can be produced by simple mixing. Further work in this area has demonstrated that such complexes targeting the breast cancer marker epidermal growth factor receptor (EGFR) show enhanced binding to tumor cells both in vitro and in vivo; indeed, the enhanced accumulation at the tumor site resulted in improved therapeutic outcomes compared to untargeted nanomedicines and free chemotherapeutics. The current work on these BsAb-HBP conjugates focuses on further probing antibody-nanomaterial interactions and demonstrating broad applicability to a range of cancer types. Reported herein are BsAb-HBP materials targeted towards prostate-specific membrane antigen (PSMA) and a study of their behavior in vivo using ⁸⁹Zr positron emission tomography (PET) in a dual-tumor prostate cancer xenograft model. In this model, mice bearing both PSMA+ and PSMA- tumors allow PET imaging to discriminate between nonspecific and targeted uptake in tumors and to better quantify the increased accumulation following BsAb conjugation. Also examined is the potential for forming these targeted complexes in situ following injection of the individual components.
This approach aims to avoid the undesirable clearance of proteinaceous complexes upon injection, which limits the available therapeutic. Ultimately, these results demonstrate that BsAb-functionalized nanomaterials are a powerful and versatile approach for producing targeted nanomedicines for a variety of cancers.
Keywords: bioengineering, cancer, nanomedicine, polymer chemistry
Procedia PDF Downloads 141
28 Impacts of Commercial Honeybees on Native Butterflies in High-Elevation Meadows in Utah, USA
Authors: Jacqueline Kunzelman, Val Anderson, Robert Johnson, Nicholas Anderson, Rebecca Bates
Abstract:
In an effort to protect honeybees from colony collapse disorder, beekeepers are filing for government permits to use natural lands as summer pasture for honeybees under the multiple-use management regime in the United States. Utilizing natural landscapes in high mountain ranges may help strengthen honeybee colonies, as these natural settings are generally devoid of the chemical pollutants and pesticides found in agricultural and urban settings. However, the introduction of a competitive species could greatly impact the native species occupying these landscapes. While honeybees and butterflies have different life histories, behavior, and foraging strategies, they compete for the same nectar resources. Few, if any, studies have focused on the potential population effects of commercial honeybees on native butterfly abundance and diversity. This study attempts to observe this impact using a paired before-after control-impact (BACI) design. Over the course of two years, malaise trap samples were collected every week during the months of the flowering season in two similar areas separated by 11 kilometers, each containing nine malaise trap sites for replication. In the first year, samples were taken to analyze and establish trends within the pollinating communities. In the second year, honeybees were introduced to only one of the two areas, and a change in trends between the two areas was assessed. Contrary to the original hypothesis, mean butterfly abundance increased significantly in the impact area after honeybees were introduced, while the control area remained relatively stable. This overall increase in abundance over the season can be attributed to an increase in butterflies during the first and second periods of data collection, when populations were near their peak.
Several potential explanations are: 1) honeybees deter a natural predator or competitor of butterflies that previously limited population growth; 2) honeybees consume resources regularly used by butterflies, which may extend butterflies' foraging time and consequent capture rates; 3) environmental factors such as the number of rainy days were inconsistent between control and impact areas, biasing capture rates. This ongoing research will help determine the suitability of high mountain ranges for the summer pasturing of honeybees and the population impacts on many different pollinators.
Keywords: butterfly, competition, honeybee, pollinator
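The BACI comparison above reduces, in its simplest form, to a difference-in-differences on mean captures: the change in the impact area minus the change in the control area. The trap counts below are made-up illustrations of the design, not the study's malaise-trap data.

```python
# BACI effect estimate: (impact after - impact before) minus
# (control after - control before), on mean butterfly counts per trap.
def mean(xs):
    return sum(xs) / len(xs)

# Hypothetical mean captures at the nine trap sites per area.
impact_before  = [10, 12, 11, 9, 10, 13, 11, 10, 12]
impact_after   = [15, 17, 16, 14, 15, 18, 16, 15, 17]   # bees introduced
control_before = [11, 10, 12, 11, 10, 12, 11, 12, 10]
control_after  = [11, 11, 12, 10, 11, 12, 11, 12, 11]

baci_effect = (mean(impact_after) - mean(impact_before)) - \
              (mean(control_after) - mean(control_before))
print(round(baci_effect, 2))
```

Subtracting the control-area change is what lets the design separate the honeybee introduction from season-wide trends (such as rainy days) that affect both areas alike; a formal analysis would test the area-by-period interaction rather than just compute the point estimate.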
Procedia PDF Downloads 146
27 Long Term Survival after a First Transient Ischemic Attack in England: A Case-Control Study
Authors: Padma Chutoo, Elena Kulinskaya, Ilyas Bakbergenuly, Nicholas Steel, Dmitri Pchejetski
Abstract:
Transient ischaemic attacks (TIAs) are warning signs for future strokes: TIA patients are at increased risk of stroke and cardiovascular events after a first episode. The majority of studies on TIA have focused on the occurrence of these ancillary events, while long-term mortality after TIA has received only limited attention. We undertook this study to determine the long-term hazards of all-cause mortality following a first episode of TIA using anonymised electronic health records (EHRs). This was a retrospective case-control study using electronic primary health care records from The Health Improvement Network (THIN) database. Patients born in or before 1960, resident in England, with a first diagnosis of TIA between January 1986 and January 2017 were matched to three controls each on age, sex, and general medical practice. The primary outcome was all-cause mortality. The hazards of all-cause mortality were estimated using a time-varying Weibull-Cox survival model that included both scale and shape effects and a random frailty effect of GP practice. 20,633 cases and 58,634 controls were included. Cases aged 39 to 60 years at the first TIA event had the highest hazard ratio (HR) of mortality compared to matched controls (HR = 3.04, 95% CI: 2.91-3.18). The HRs for cases aged 61-70, 71-76, and 77+ years were 1.98 (1.55-2.30), 1.79 (1.20-2.07), and 1.52 (1.15-1.97), respectively, compared to matched controls. Aspirin provided long-term survival benefits to cases: cases aged 39-60 years on aspirin had HRs of 0.93 (0.84-1.00), 0.90 (0.82-0.98), and 0.88 (0.80-0.96) at 5, 10, and 15 years, respectively, compared to cases in the same age group who were not on antiplatelets, and similar beneficial effects of aspirin were observed in the other age groups. There were no significant survival benefits with other antiplatelet options, and no survival benefits of antiplatelet drugs were observed in controls.
Our study highlights the excess long-term risk of death of TIA patients and cautions that TIA should not be treated as a benign condition. The study further recommends aspirin as a better option for secondary prevention in TIA patients than the clopidogrel recommended by NICE guidelines. Management of risk factors and treatment strategies remain important challenges in reducing the burden of disease.
Keywords: dual antiplatelet therapy (DAPT), general practice, multiple imputation, The Health Improvement Network (THIN), hazard ratio (HR), Weibull-Cox model
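To illustrate why a Weibull model with both scale and shape effects implies a hazard ratio that changes over follow-up time, the sketch below evaluates the Weibull hazard for two groups under purely hypothetical parameters (these are illustrative values, not the study's fitted estimates):

```python
def weibull_hazard(t, scale, shape):
    """Weibull hazard h(t) = (shape/scale) * (t/scale)**(shape - 1)."""
    return (shape / scale) * (t / scale) ** (shape - 1)

# Hypothetical parameters: cases differ from controls in both the scale
# and the shape of the hazard, so the hazard ratio varies with time.
controls = dict(scale=30.0, shape=1.4)
cases = dict(scale=18.0, shape=1.2)

for t in (1, 5, 10, 15):  # years of follow-up
    hr = weibull_hazard(t, **cases) / weibull_hazard(t, **controls)
    print(f"t = {t:2d} y, HR = {hr:.2f}")
```

Under these parameters the excess hazard of the cases is largest early in follow-up and declines over time, the kind of time-varying pattern a scale-and-shape model can capture but a plain proportional-hazards model cannot.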
Procedia PDF Downloads 147
26 Farmers' Willingness to Pay for Irrigated Maize Production in Rural Kenya
Authors: Dennis Otieno, Lilian Kirimi, Nicholas Odhiambo, Hillary Bii
Abstract:
Kenya is considered a middle-income country but usually does not meet household food security needs, especially in its northern and south-eastern parts. Approximately half of the population lives below the poverty line (CIA, 2012). Agriculture is the largest sector in the country, employing 80% of the population, who are therefore directly dependent on the sufficiency of its outputs. This makes efficient, easily accessible and affordable agricultural practices important for improving food security. Maize is the prime staple food commodity in Kenya and represents a substantial share of people’s nutritional intake. This study draws on questionnaire-based interviews, key informant interviews and focus group discussions involving 220 small-scale Kenyan maize farmers across the Lower Kuja, Bunyala, Nandi, Lower Nzoia, Perkerra, Mwea, Bura, Hola and Galana Kulalu irrigation schemes. The questionnaire captured the farmers’ use and perceived importance of irrigation services and irrigated maize production. Viability was evaluated using four indices, all of which were positive, with NPV indicating positive cash flows within at most 21 years for one season’s output. The mean willingness to pay was KES 3,082, and willingness to pay increased with irrigation premiums. The economic value of water was found to be greater than the willingness to pay, implying that irrigated maize production is sustainable. Farmers stated that viability was influenced by high output levels, good produce quality, choice of crop, availability of sufficient water, and enforcement; the last two factors had a positive influence, while the others had a negative effect on the viability of irrigated maize. A regression related willingness to pay for irrigated maize production to scheme- and plot-level factors. 
Farmers who already use other inputs such as animal manure, hired labor and chemical fertilizer should also have a demand for improved seeds, according to Liebig's law of the minimum and expansion path theory. The regression showed that premiums and high yields have a positive effect on willingness to pay, while produce quality, efficient fertilizer use, and crop season have a negative effect.
Keywords: maize, food security, profits, sustainability, willingness to pay
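The reported relationship, willingness to pay rising with the irrigation premium, can be sketched as a simple least-squares fit. The figures below are hypothetical, chosen only so that the mean WTP is close to the reported KES 3,082:

```python
# Hypothetical plot-level observations (not the survey's actual figures):
# stated willingness to pay (KES) against the irrigation premium charged.
premiums = [500, 1000, 1500, 2000, 2500]   # KES per season
wtp = [2600, 2900, 3100, 3300, 3500]       # stated WTP, KES

n = len(premiums)
mean_x = sum(premiums) / n
mean_y = sum(wtp) / n
# Ordinary least squares in closed form for one regressor.
slope = sum((x - mean_x) * (y - mean_y) for x, y in zip(premiums, wtp)) \
        / sum((x - mean_x) ** 2 for x in premiums)
intercept = mean_y - slope * mean_x
print(f"WTP = {intercept:.0f} + {slope:.2f} * premium (KES)")
```

A positive fitted slope reproduces the direction of the premium effect reported in the abstract; the full study regressed WTP on several scheme- and plot-level factors at once.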
Procedia PDF Downloads 220
25 Demonstrating the Efficacy of a Low-Cost Carbon Dioxide-Based Cryoablation Device in Veterinary Medicine for Translation to Third World Medical Applications
Authors: Grace C. Kuroki, Yixin Hu, Bailey Surtees, Rebecca Krimins, Nicholas J. Durr, Dara L. Kraitchman
Abstract:
The purpose of this study was to perform a Phase I veterinary clinical trial with a low-cost, carbon-dioxide-based, passive-thaw cryoablation device as proof of principle for application in pets and translation to third world treatment of breast cancer. This study was approved by the institutional animal care and use committee. Client-owned dogs with subcutaneous masses, primarily lipomas or mammary cancers, were recruited for the study. Inclusion was based on clinical history, lesion location, preanesthetic blood work, and fine needle aspirate or biopsy confirmation of the mass. Informed consent was obtained from the owners of dogs that met inclusion criteria. Ultrasound assessment of mass extent was performed immediately prior to mass cryoablation. Dogs were placed under general anesthesia and sterilely prepared. A stab incision was created to insert a custom cryoablation probe, 4.19 mm OD × 55.9 mm length (Kubanda Cryotherapy), into the mass. Originally designed for treating breast cancer in low-resource settings, this device has demonstrated potential in effectively necrosing subcutaneous masses. A dose escalation study of increasing freeze-thaw cycles (5/4/5, 7/5/7, and 10/7/10 min) was performed to assess the size of the iceball and the necrotic extent of cryoablation. Each dog was allowed to recover for ~1-2 weeks before surgical removal of the mass. A single mass was treated in seven dogs (2 mammary masses, a sarcoma, 4 lipomas, and 1 adnexal mass), with most masses exceeding 2 cm in any dimension. Mass involution was most evident in the malignant mammary and adnexal masses. Lipomas showed minimal shrinkage prior to surgical removal, but an area of necrosis was evident along the cryoablation probe path. Gross assessment indicated a clear margin of cryoablation along the cryoprobe independent of tumor type. Detailed histopathology is pending, but complete involution of large lipomas appears unlikely with a 10/7/10 protocol. 
The low-cost, carbon dioxide-based cryotherapy device permits a minimally invasive technique that may be useful for veterinary applications, but the results also indicate that complete resolution of benign adipose breast masses, as may be encountered in third world countries, is unlikely.
Keywords: cryoablation, cryotherapy, interventional oncology, veterinary technology
Procedia PDF Downloads 131
24 Improving Screening and Treatment of Binge Eating Disorders in Pediatric Weight Management Clinic through a Quality Improvement Framework
Authors: Cristina Fernandez, Felix Amparano, John Tumberger, Stephani Stancil, Sarah Hampl, Brooke Sweeney, Amy R. Beck, Helena H. Laroche, Jared Tucker, Eileen Chaves, Sara Gould, Matthew Lindquist, Lora Edwards, Renee Arensberg, Meredith Dreyer, Jazmine Cedeno, Alleen Cummins, Jennifer Lisondra, Katie Cox, Kelsey Dean, Rachel Perera, Nicholas A. Clark
Abstract:
Background: Adolescents with obesity are at higher risk of disordered eating than the general population. Detection of eating disorders (ED) is difficult. Screening questionnaires may aid in early detection of ED. Our team’s prior efforts focused on increasing ED screening rates to ≥90% using a validated 10-question adolescent binge eating disorder screening questionnaire (ADO-BED). This aim was achieved. We then aimed to improve treatment plan initiation for patients ≥12 years of age who screen positive for BED within our weight management clinic (WMC) from 33% to 70% within 12 months. Methods: Our WMC is within a tertiary-care, free-standing children’s hospital. A3, an improvement framework, was used. A multidisciplinary team (physicians, nurses, registered dietitians, psychologists, and exercise physiologists) was created. The outcome measure was documentation of treatment plan initiation for those who screen positive (goal 70%). The process measure was the ADO-BED screening rate of WMC patients (goal ≥90%). Plan-Do-Study-Act (PDSA) cycle 1 included provider education on current literature and treatment plan initiation based upon ADO-BED responses. PDSA 2 involved increasing documentation of treatment plans and retraining providers on the process. Pre-defined treatment plans were: 1) repeat screen in 3-6 months, 2) resources provided only, or 3) comprehensive multidisciplinary weight management team evaluation. Run charts monitored impact over time. Results: Within 9 months, 166 patients were seen in the WMC. The process measure showed sustained performance above goal (mean 98%). The outcome measure showed special cause improvement from a mean of 33% to 100% (n=31). Of treatment plans provided, 45% received Plan 1, 4% Plan 2, and 46% Plan 3. Conclusion: Through a multidisciplinary improvement team approach, we maintained sustained ADO-BED screening performance and, ahead of our 12-month timeline, achieved our project aim. Our efforts may serve as a model for other multidisciplinary WMCs. 
Next steps may include expanding project scope to other WM programs.
Keywords: obesity, pediatrics, clinic, eating disorder
Procedia PDF Downloads 60
23 Predictors of Pericardial Effusion Requiring Drainage Following Coronary Artery Bypass Graft Surgery: A Retrospective Analysis
Authors: Nicholas McNamara, John Brookes, Michael Williams, Manish Mathew, Elizabeth Brookes, Tristan Yan, Paul Bannon
Abstract:
Objective: Pericardial effusions are an uncommon but potentially fatal complication after cardiac surgery. The goal of this study was to describe the incidence and risk factors associated with the development of pericardial effusion requiring drainage after coronary artery bypass graft surgery (CABG). Methods: A retrospective analysis was undertaken using prospectively collected data. All adult patients who underwent CABG at our institution between 1st January 2017 and 31st December 2018 were included. Pericardial effusion was diagnosed using transthoracic echocardiography (TTE) performed for clinical suspicion of pre-tamponade or tamponade. Drainage was undertaken if considered clinically necessary and performed via a sub-xiphoid incision, pericardiocentesis, or re-sternotomy at the discretion of the treating surgeon. Patient demographics, operative characteristics, anticoagulant exposure, and postoperative outcomes were examined to identify variables associated with the development of pericardial effusion requiring drainage. Tests of association were performed using the Fisher exact test for dichotomous variables and the Student t-test for continuous variables. Logistic regression models were used to determine univariate predictors of pericardial effusion requiring drainage. Results: Between 1st January 2017 and 31st December 2018, a total of 408 patients underwent CABG at our institution, and eight (1.9%) required drainage of a pericardial effusion. There was no difference in age, gender, or the proportion of patients on preoperative therapeutic heparin between the study and control groups. 
Univariate analysis identified preoperative atrial arrhythmia (37.5% vs 8.8%, p = 0.03), reduced left ventricular ejection fraction (47% vs 56%, p = 0.04), longer cardiopulmonary bypass (130 vs 84 min, p < 0.01) and cross-clamp (107 vs 62 min, p < 0.01) times, higher drain output in the first four postoperative hours (420 vs 213 mL, p < 0.01), postoperative atrial fibrillation (100% vs 32%, p < 0.01), and pleural effusion requiring drainage (87.5% vs 12.5%, p < 0.01) as associated with the development of pericardial effusion requiring drainage. Conclusion: In this study, the incidence of pericardial effusion requiring drainage was 1.9%. Several factors, mainly related to preoperative or postoperative arrhythmia, length of surgery, and pleural effusion requiring drainage, were found to be associated with the development of clinically significant pericardial effusions. High clinical suspicion and a low threshold for transthoracic echocardiography are pertinent to ensure this potentially lethal condition is not missed.
Keywords: coronary artery bypass, pericardial effusion, pericardiocentesis, tamponade, sub-xiphoid drainage
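The Fisher exact test used here for dichotomous variables can be computed directly from the hypergeometric distribution. A minimal two-sided version is sketched below, applied to a classic illustrative 2×2 table rather than the study's data:

```python
from math import comb

def fisher_exact_two_sided(a, b, c, d):
    """Two-sided Fisher exact test for the 2x2 table [[a, b], [c, d]]:
    sum the hypergeometric probabilities of all tables with the same
    margins that are no more likely than the observed table."""
    row1, col1, n = a + b, a + c, a + b + c + d

    def p_table(k):
        # P(top-left cell = k) under fixed margins (hypergeometric).
        return comb(col1, k) * comb(n - col1, row1 - k) / comb(n, row1)

    p_obs = p_table(a)
    lo, hi = max(0, row1 + col1 - n), min(row1, col1)
    return sum(p_table(k) for k in range(lo, hi + 1)
               if p_table(k) <= p_obs + 1e-12)

# Classic illustrative table [[3, 1], [1, 3]] (tea-tasting style example).
print(f"p = {fisher_exact_two_sided(3, 1, 1, 3):.4f}")
```

For the small cell counts in this study (eight patients in the effusion group), an exact test of this kind is the appropriate choice over a chi-squared approximation.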
Procedia PDF Downloads 160
22 Virtual Metering and Prediction of Heating, Ventilation, and Air Conditioning Systems Energy Consumption by Using Artificial Intelligence
Authors: Pooria Norouzi, Nicholas Tsang, Adam van der Goes, Joseph Yu, Douglas Zheng, Sirine Maleej
Abstract:
In this study, virtual meters are designed and used for energy balance measurements of an air handling unit (AHU). The method aims to replace traditional physical sensors in heating, ventilation, and air conditioning (HVAC) systems with simulated virtual meters. Because these systems are difficult to manage and monitor, many HVAC systems operate with a high level of inefficiency and energy wastage. Virtual meters are implemented and applied in an actual HVAC system, and the results confirm the practicality of mathematical sensors for alternative energy measurement. While most residential buildings and offices are not equipped with advanced sensors, adding, exploiting, and monitoring sensors and measurement devices in existing systems can cost thousands of dollars. The first purpose of this study is to provide an energy consumption rate based on available sensors, without any physical energy meters, and to prove the performance of virtual meters in HVAC systems as reliable measurement devices. To demonstrate this concept, mathematical models are created for AHU-07, located in building NE01 of the British Columbia Institute of Technology (BCIT) Burnaby campus. The models are integrated with the system’s historical data and physical spot measurements, and the actual measurements are used to validate the models’ accuracy. Based on preliminary analysis, the resulting mathematical models successfully reproduce energy consumption patterns, and the results of the virtual meters are expected to be close to those that physical meters could achieve. In the second part of this study, the use of virtual meters is further assisted by artificial intelligence (AI) in the HVAC systems of buildings to improve energy management and efficiency. 
Using a data mining approach, virtual meter data is recorded as historical data, and HVAC system energy consumption prediction is implemented in order to realize energy savings and manage the demand and supply chain effectively. Energy prediction can inform energy-saving strategies and opens a window to predictive control aimed at lower energy consumption. The energy prediction could thus optimize the HVAC system and automate energy management to capture savings. This study also investigates the possibility of AI solutions for autonomous HVAC efficiency that allow a quick and efficient response to energy consumption and cost spikes in the energy market.
Keywords: virtual meters, HVAC, artificial intelligence, energy consumption prediction
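A virtual meter of the kind described replaces a physical energy meter with a calculation over sensors that are already present. A minimal sensible-heat sketch for an AHU coil is shown below; the airflow and temperature values are illustrative, not AHU-07's actual readings:

```python
# Approximate physical constants for air near room conditions.
RHO_AIR = 1.2    # density, kg/m^3
CP_AIR = 1.005   # specific heat, kJ/(kg*K)

def coil_power_kw(airflow_m3_s, t_in_c, t_out_c):
    """Virtual meter: sensible coil power Q = rho * V * cp * (T_out - T_in), in kW.

    Uses an airflow sensor and two temperature sensors in place of a
    physical energy meter; positive output means heating, negative cooling.
    """
    mass_flow = RHO_AIR * airflow_m3_s  # kg/s
    return mass_flow * CP_AIR * (t_out_c - t_in_c)

q = coil_power_kw(airflow_m3_s=2.5, t_in_c=5.0, t_out_c=18.0)
print(f"virtual heating-coil reading: {q:.1f} kW")
```

Logged over time, readings like this form the historical dataset on which the consumption-prediction models described in the abstract can be trained.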
Procedia PDF Downloads 104
21 A Systematic Review of Efficacy and Safety of Radiofrequency Ablation in Patients with Spinal Metastases
Authors: Pascale Brasseur, Binu Gurung, Nicholas Halfpenny, James Eaton
Abstract:
Development of minimally invasive treatments in recent years provides a potential alternative to invasive surgical interventions which are of limited value to patients with spinal metastases due to short life expectancy. A systematic review was conducted to explore the efficacy and safety of radiofrequency ablation (RFA), a minimally invasive treatment in patients with spinal metastases. EMBASE, Medline and CENTRAL were searched from database inception to March 2017 for randomised controlled trials (RCTs) and non-randomised studies. Conference proceedings for ASCO and ESMO published in 2015 and 2016 were also searched. Fourteen studies were included: three prospective interventional studies, four prospective case series and seven retrospective case series. No RCTs or studies comparing RFA with another treatment were identified. RFA was followed by cement augmentation in all patients in seven studies and some patients (40-96%) in the remaining seven studies. Efficacy was assessed as pain relief in 13/14 studies with the use of a numerical rating scale (NRS) or a visual analogue scale (VAS) at various time points. Ten of the 13 studies reported a significant decrease in pain outcome, post-RFA compared to baseline. NRS scores improved significantly at 1 week (5.9 to 3.5, p < 0.0001; 8 to 4.3, p < 0.02 and 8 to 3.9, p < 0.0001) and this improvement was maintained at 1 month post-RFA compared to baseline (5.9 to 2.6, p < 0.0001; 8 to 2.9, p < 0.0003; 8 to 2.9, p < 0.0001). Similarly, VAS scores decreased significantly at 1 week (7.5 to 2.7, p=0.00005; 7.51 to 1.73, p < 0.0001; 7.82 to 2.82, p < 0.001) and this pattern was maintained at 1 month post-RFA compared to baseline (7.51 to 2.25, p < 0.0001; 7.82 to 3.3; p < 0.001). A significant pain relief was achieved regardless of whether patients had cement augmentation in two studies assessing the impact of RFA with or without cement augmentation on VAS pain scores. 
In these two studies, a significant decrease in pain scores was reported for patients receiving RFA alone and RFA + cement at 1 week (4.3 to 1.7, p = 0.0004 and 6.6 to 1.7, p = 0.003, respectively) and at 15-36 months (7.9 to 4, p = 0.008 and 7.6 to 3.5, p = 0.005, respectively) after therapy. Few minor complications were reported, including neural damage, radicular pain, vertebroplasty leakage and lower limb pain/numbness. In conclusion, findings on the efficacy and safety of RFA were consistently positive across prospective and retrospective studies, with reductions in pain and few procedural complications. However, the lack of control groups in the identified studies indicates the possibility of selection bias inherent in single-arm studies. Controlled trials exploring the efficacy and safety of RFA in patients with spinal metastases are warranted to provide robust evidence. The identified studies provide an initial foundation for such future trials.
Keywords: pain relief, radiofrequency ablation, spinal metastases, systematic review
Procedia PDF Downloads 173
20 US Track and Field System: Examining Micro-Level Practices against a Global Model for Integrated Development of Mass and Elite Sport
Authors: Peter Smolianov, Steven Dion, Christopher Schoen, Jaclyn Norberg, Nicholas Stone, Soufiane Rafi
Abstract:
This study assessed the micro-level elements of track and field development in the US against a model for integrating high-performance sport with mass participation. This investigation is important for the country’s international sport performance, which has declined relative to other countries, and for public wellbeing, which has deteriorated as over half of the US population became overweight. A questionnaire was designed for the following elements of the model: talent identification and development as well as advanced athlete support. Survey questions were validated by 12 experts, including academics, executives from sport governing bodies, coaches, and administrators. To determine the areas for improvement, the questionnaires were completed by 102 US track and field coaches representing the country’s regions and coaching levels. Possible advancements were further identified through semi-structured discussions with 10 US track and field administrators. The study found that talent search and development is a critically important area for improvement: 49 percent of respondents had overall negative perceptions, and only 16 percent were positive regarding these US track and field practices. Both quantitative survey results and open responses revealed that the key reason for inadequate athlete development was a shortage of well-educated and properly paid coaches: 77 percent of respondents indicated that coach expertise is never or rarely high across all participant ages and levels. More than 40 percent of the respondents were uncertain of or not familiar with the world’s best talent identification and development practices, particularly methods of introducing children to track and field from outside the sport’s participation base. Millions more could be attracted to the sport by adopting best international practices. 
First, physical education should be offered a minimum of three times a week in all school grades, and track and field, together with other healthy sports, should be taught at school to all children. Second, multi-sport events, including track and field disciplines, should be organized for everyone within and among all schools, cities and regions. Third, Australian and Eastern European methods of talent search at schools should be utilized and tailored to US conditions. Fourth, comprehensive long-term athlete development guidelines should be used for the advancement of the American Development Model, particularly track and field tests and guidelines, as part of both school education and high-performance athlete development for every age group from six to over 70 years old. These world-leading practices would improve the country’s international performance while increasing national sport participation and positively influencing public health.
Keywords: high performance, mass participation, sport development, track and field, USA
Procedia PDF Downloads 144
19 Survival Analysis after a First Ischaemic Stroke Event: A Case-Control Study in the Adult Population of England
Authors: Padma Chutoo, Elena Kulinskaya, Ilyas Bakbergenuly, Nicholas Steel, Dmitri Pchejetski
Abstract:
Stroke is associated with a significant risk of morbidity and mortality. There is a scarcity of research on long-term survival after first-ever ischaemic stroke (IS) events in England with regard to the effects of different medical therapies and comorbidities. The objective of this study was to model all-cause mortality after an IS diagnosis in the adult population of England. Using a retrospective case-control design, we extracted the electronic medical records of patients born prior to or in 1960 in England with a first-ever ischaemic stroke diagnosis from January 1986 to January 2017 within The Health Improvement Network (THIN) database. Participants with a history of ischaemic stroke were matched to three controls on sex, age at diagnosis and general practice. The primary outcome was all-cause mortality. The hazards of all-cause mortality were estimated using a Weibull-Cox survival model which included both scale and shape effects and a shared random effect of general practice. The model included sex, birth cohort, socio-economic status, comorbidities and medical therapies. 20,250 patients with a history of IS (cases) and 55,519 controls were followed up to 30 years. From 2008 to 2015, the one-year all-cause mortality for IS patients declined by an absolute 0.5%. Preventive treatments for cases increased considerably over time, including prescriptions of statins and antihypertensives. However, prescriptions for antiplatelet drugs decreased in routine general practice after 2010. The survival model revealed a survival benefit of antiplatelet treatment for stroke survivors, with a hazard ratio (HR) of 0.92 (0.90 - 0.94). IS diagnosis had significant interactions with gender, age at entry and hypertension diagnosis. IS diagnosis was associated with a high risk of all-cause mortality, with HR = 3.39 (3.05 - 3.72) for cases compared to controls. 
Hypertension was associated with poor survival, with HR = 4.79 (4.49 - 5.09) for hypertensive cases relative to non-hypertensive controls, though the detrimental effect of hypertension did not reach significance for hypertensive controls, HR = 1.19 (0.82 - 1.56). This study of English primary care data showed that between 2008 and 2015, the rates of prescriptions of stroke preventive treatments increased, and short-term all-cause mortality after IS declined. However, stroke resulted in poor long-term survival. Hypertension, a modifiable risk factor, was found to be associated with poor survival outcomes in IS patients. Antiplatelet drugs were found to be protective of survival. Better efforts are required to reduce the burden of stroke through health service development and primary prevention.
Keywords: general practice, hazard ratio, The Health Improvement Network (THIN), ischaemic stroke, multiple imputation, Weibull-Cox model
Procedia PDF Downloads 185
18 Roadmap to a Bottom-Up Approach Creating Meaningful Contributions to Surgery in Low-Income Settings
Authors: Eva Degraeuwe, Margo Vandenheede, Nicholas Rennie, Jolien Braem, Miryam Serry, Frederik Berrevoet, Piet Pattyn, Wouter Willaert, InciSioN Belgium Consortium
Abstract:
Background: Worldwide, five billion people lack access to safe and affordable surgical care. An additional 1.27 million surgeons, anesthesiologists, and obstetricians (SAOs) are needed by 2030 to meet the target of 20 per 100,000 population and to reach the goal of the Lancet Commission on Global Surgery. A well-informed future generation, exposed early on to the current challenges in global surgery (GS), is necessary to ensure a sustainable future. Methods: InciSioN, the International Student Surgical Network, is a non-profit organization by and for students, residents, and fellows in over 80 countries. InciSioN Belgium, one of the prominent national working groups, has made considerable progress and has collaborated with other networks to fill the educational gap, stimulate advocacy efforts and increase interaction with the international network. This report describes a roadmap to achieve sustainable development and education within GS, using the example of InciSioN Belgium. Results: Since the establishment of the organization’s branch in 2019, it has hosted an educational workshop for first-year residents in surgery, engaging over 2,500 participants, and established a recurring directing board of 15 members. In the year 2020-2021, InciSioN Ghent organized three workshops combining educational and interactive sessions for future prime advocates and surgical candidates. InciSioN Belgium has set up a strong formal coalition with the Belgian Medical Students’ Association (BeMSA), with its own standing committee, reaching over 3,000 medical students annually. In 2021-2022, InciSioN Belgium broadened to a multidisciplinary approach, including dentistry and nursing students and graduates in workshops and research projects, leading to a 450% increase in membership and exposure. 
This roadmap sets strategic goals and mechanisms for the GS community to achieve nationwide, sustained improvements in GS research and education focused on future SAOs, in order to achieve the GS sustainable development goals. In the coming year, expansion is directed toward formal integration of GS into the medical curriculum and increased international advocacy, while inspiring SAOs to engage in GS in Belgium. Conclusion: The development and implementation of durable change for GS are necessary. The student organization InciSioN Belgium is growing and hopes to help close the colossal gap in GS and inspire the growth of other branches while sharing the know-how of a student organization.
Keywords: advocacy, education, global surgery, InciSioN, student network
Procedia PDF Downloads 174
17 Investigation of Cavitation in a Centrifugal Pump Using Synchronized Pump Head Measurements, Vibration Measurements and High-Speed Image Recording
Authors: Simon Caba, Raja Abou Ackl, Svend Rasmussen, Nicholas E. Pedersen
Abstract:
It is a challenge to directly monitor cavitation in a pump application during operation because of a lack of visual access to validate the presence of cavitation and its form of appearance. In this work, experimental investigations are carried out in an inline single-stage centrifugal pump with optical access. Hence, it gives the opportunity to enhance the value of CFD tools and standard cavitation measurements. Experiments are conducted using two impellers running in the same volute at 3000 rpm and the same flow rate. One of the impellers used is optimized for lower NPSH₃% by its blade design, whereas the other one is manufactured using a standard casting method. The cavitation is detected by pump performance measurements, vibration measurements and high-speed image recordings. The head drop and the pump casing vibration caused by cavitation are correlated with the visual appearance of the cavitation. The vibration data is recorded in an axial direction of the impeller using accelerometers recording at a sample rate of 131 kHz. The vibration frequency domain data (up to 20 kHz) and the time domain data are analyzed as well as the root mean square values. The high-speed recordings, focusing on the impeller suction side, are taken at 10,240 fps to provide insight into the flow patterns and the cavitation behavior in the rotating impeller. The videos are synchronized with the vibration time signals by a trigger signal. A clear correlation between cloud collapses and abrupt peaks in the vibration signal can be observed. The vibration peaks clearly indicate cavitation, especially at higher NPSHA values where the hydraulic performance is not affected. It is also observed that below a certain NPSHA value, the cavitation started in the inlet bend of the pump. Above this value, cavitation occurs exclusively on the impeller blades. 
The impeller optimized for NPSH₃% does show a lower NPSH₃% than the standard impeller, but the head drop starts at a higher NPSHA value and is more gradual. Instabilities in the head drop curve of the optimized impeller were observed in addition to a higher vibration level. Furthermore, the cavitation clouds on the suction side appear more unsteady when using the optimized impeller. The shape and location of the cavitation are compared to 3D fluid flow simulations. The simulation results are in good agreement with the experimental investigations. In conclusion, these investigations attempt to give a more holistic view of the appearance of cavitation by comparing the head drop, vibration spectral data, vibration time signals, image recordings and simulation results. The data indicate that a criterion for cavitation detection could be derived from the time-domain vibration measurements, which requires further investigation. Usually, spectral data is used to analyze cavitation, but these investigations indicate that the time domain could be more appropriate for some applications.
Keywords: cavitation, centrifugal pump, head drop, high-speed image recordings, pump vibration
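The reported correlation between cloud collapses and abrupt vibration peaks suggests a simple time-domain indicator: the RMS of the acceleration signal rises whenever a transient is present. A minimal sketch on a synthetic signal (illustrative only, not the measured 131 kHz data):

```python
import math

def rms(samples):
    """Root-mean-square of a vibration time signal."""
    return math.sqrt(sum(s * s for s in samples) / len(samples))

# Synthetic signal: low-level sinusoidal background with one sharp
# transient spliced in, mimicking a cavitation cloud collapse.
background = [0.1 * math.sin(2 * math.pi * i / 64) for i in range(256)]
with_spike = background[:128] + [3.0, -2.5, 2.0] + background[131:]

print(f"background RMS:     {rms(background):.3f}")
print(f"with transient RMS: {rms(with_spike):.3f}")
```

Evaluating such an RMS (or a peak detector) over short sliding windows would flag the collapse events seen in the synchronized high-speed video, which is the kind of time-domain criterion the abstract proposes investigating further.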
Procedia PDF Downloads 179
16 Predicting OpenStreetMap Coverage by Means of Remote Sensing: The Case of Haiti
Authors: Ran Goldblatt, Nicholas Jones, Jennifer Mannix, Brad Bottoms
Abstract:
Accurate, complete, and up-to-date geospatial information is the foundation of successful disaster management. When the 2010 Haiti earthquake struck, accurate and timely information on the distribution of critical infrastructure was essential for the disaster response community and for effective search and rescue operations. Existing geospatial datasets such as Google Maps did not have comprehensive coverage of these features. In the days following the earthquake, many organizations released high-resolution satellite imagery, catalyzing a worldwide effort to map Haiti and support the recovery operations. Among these efforts, OpenStreetMap (OSM), a collaborative project to create a free editable map of the world, used the imagery to support volunteers in digitizing roads, buildings, and other features, creating the most detailed map of Haiti in existence in just a few weeks. However, large portions of the island are still not fully covered by OSM, and there is an increasing need for a tool to automatically identify which areas in Haiti, as well as in other countries vulnerable to disasters, are not fully mapped. The objective of this project is to leverage different types of remote sensing measurements, together with machine learning approaches, to identify geographical areas where OSM coverage of building footprints is incomplete. Several remote sensing measures and derived products were assessed as potential predictors of OSM building footprint coverage, including: intensity of light emitted at night (based on VIIRS measurements), spectral indices derived from the Sentinel-2 satellite (normalized difference vegetation index (NDVI), normalized difference built-up index (NDBI), soil-adjusted vegetation index (SAVI), urban index (UI)), surface texture (based on Sentinel-1 SAR measurements), elevation and slope. 
Additional remote sensing derived products, such as Hansen Global Forest Change, DLR's Global Urban Footprint (GUF), and the World Settlement Footprint (WSF), were also evaluated as predictors, as was the OSM street and road network (including junctions). A supervised classification with a random forest classifier explained 89% of the variation in OSM building footprint area in a given cell. These predictions allowed for the identification of cells that are predicted to contain buildings but are not actually mapped yet. With these results, this methodology could be adapted to any location to assist with preparing for future disastrous events and to ensure that essential geospatial information is available to support response and recovery efforts during and following major disasters.
Keywords: disaster management, Haiti, machine learning, OpenStreetMap, remote sensing
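The final step, turning per-cell predictions into a map of likely coverage gaps, can be sketched as follows; the cell values and thresholds below are hypothetical, not the project's actual grid:

```python
# Illustrative sketch: compare predicted vs. mapped OSM building-footprint
# area per grid cell and flag likely coverage gaps. The predicted areas
# stand in for the output of the trained random forest model.
cells = {
    "cell_A": {"predicted_m2": 12_000, "mapped_m2": 11_500},
    "cell_B": {"predicted_m2": 9_000, "mapped_m2": 1_200},   # likely gap
    "cell_C": {"predicted_m2": 300, "mapped_m2": 0},         # barely built-up
}

def undermapped(cells, min_predicted_m2=1_000, coverage_ratio=0.5):
    """Return cells where mapped area falls well short of predicted area."""
    return [name for name, c in cells.items()
            if c["predicted_m2"] >= min_predicted_m2
            and c["mapped_m2"] < coverage_ratio * c["predicted_m2"]]

print(undermapped(cells))  # flags cell_B; cell_C is below the size floor
```

Cells flagged this way could then be queued for volunteer mapping campaigns in disaster-prone regions, which is the practical use case the abstract describes.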
Procedia PDF Downloads 125
15 Wrestling with Religion: A Theodramatic Exploration of Morality in Popular Culture
Authors: Nicholas Fieseler
Abstract:
The nature of religion implicit in popular culture is relevant both in and out of the university. The traditional rules-based conception of religion and the ethical systems that emerge from it do not necessarily convey the behavior of daily life as it exists apart from spaces deemed sacred. This paper proposes to examine the religion implicit in the popular culture phenomenon of professional wrestling and how that affects the understanding of popular religion. Pro wrestling, while frequently dismissed, offers a unique manner through which to re-examine religion in popular culture. A global phenomenon, pro wrestling occupies a distinct space in numerous countries and presents a legitimate reflection of human behavior cross-culturally on a scale few other phenomena can equal. Given its global viewership of millions, it should be recognized as a significant means of interpreting the human attraction to violence and its association with religion in general. Hans Urs von Balthasar’s theory of theodrama will be used to interrogate the inchoate religion within pro wrestling. While Balthasar developed theodrama within the confines of Christian theology, theodrama contains remarkable versatility in its potential utility. Since theodrama re-envisions reality as drama, the actions of every human actor on the stage contribute to the play’s development, and all action contains some transcendent value. It is in this sense that even the “low brow” activity of pro wrestling may be understood in religious terms. Moreover, a pro wrestling storyline acts as a play within a play: the struggles in a pro wrestling match reflect human attitudes toward life as it exists in the sacred and profane realms. The indistinct lines separating traditionally good (face) from traditionally bad (heel) wrestlers mirror the moral ambiguity through which many people interpret life. 
This blurred distinction between good and bad, and large segments of an audience’s embrace of heel wrestlers, reveal ethical constraints that guide the everyday values of pro wrestling spectators: a moral ambivalence that is often overlooked by traditional religious systems and has hitherto been neglected in the academic literature on pro wrestling. The significance of interpreting the religion implicit in pro wrestling through a theodramatic lens extends beyond pro wrestling specifically and can illuminate the religion implicit in popular culture in general. The use of theodrama mitigates the rigid separation often ascribed to areas deemed sacred/profane or transcendent/immanent, enabling a re-evaluation of religion and ethical systems as practiced in popular culture. Theodrama will be applied by treating the pro wrestling match as a literary text that reflects the society from which it emerges. This analysis will also reveal the complex nature of religion in popular culture and provide new directions for the academic study of religion. This project consciously bridges the academic and popular realms. The goal of the research is not only to add to the academic literature on implicit religion in popular culture but to publish it in a form that speaks to those outside the standard academic audiences for such work.
Keywords: ethics, popular religion, professional wrestling, theodrama
Procedia PDF Downloads 141
14 Gauging Floral Resources for Pollinators Using High Resolution Drone Imagery
Authors: Nicholas Anderson, Steven Petersen, Tom Bates, Val Anderson
Abstract:
Under the multiple-use management regime established in the United States for federally owned lands, government agencies have come under pressure from commercial apiaries to grant permits for the summer pasturing of honeybees on government lands. Federal agencies have struggled to integrate honeybees into their management plans and have little information with which to make regulations that resolve how many colonies should be allowed in a single location and at what distance sets of hives should be placed. Many conservation groups have voiced their concerns regarding the introduction of honeybees to these natural lands, as they may outcompete and displace native pollinating species. Assessing the quality of an area in regard to its floral resources, pollen, and nectar can be important when attempting to create regulations for the integration of commercial honeybee operations into a native ecosystem. Areas with greater floral resources may be able to support larger numbers of honeybee colonies, while poorer resource areas may be less resilient to introduced disturbances. This study attempts to determine flower cover using high-resolution drone imagery to help assess the floral resources available to pollinators in high-elevation, tall forb communities. This knowledge will help in determining the potential of different areas for honeybee pasturing and honey production. Roughly 700 images were captured at 23 m above ground level using a drone equipped with a Sony QX1 RGB 20-megapixel camera. These images were stitched together using Pix4D, resulting in a 60 m diameter high-resolution mosaic of a tall forb meadow. Using the program ENVI, a supervised maximum likelihood classification was conducted to calculate the percentage of total flower cover and flower cover by color (blue, white, and yellow). A complete vegetation inventory was taken on site, and the major flowers contributing to each color class were noted. 
An accuracy assessment was performed on the classification, yielding an 89% overall accuracy and a kappa statistic of 0.855. At this level of accuracy, drones provide an affordable and time-efficient method for assessing floral cover over large areas. The next step of this project will be to determine the average pollen and nectar loads carried by each flower species. The addition of this knowledge will result in a quantifiable method of measuring the pollen and nectar resources of entire landscapes. This information will not only help land managers determine stocking rates for honeybees on public lands but also has applications in the agricultural setting, aiding producers in determining the number of honeybee colonies necessary for proper pollination of fruit and nut crops.
Keywords: honeybee, flower, pollinator, remote sensing
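For reference, the overall accuracy and kappa statistic reported in this abstract are computed directly from a classification's confusion matrix. The matrix below is a made-up example with four cover classes, not the study's actual assessment data:

```python
import numpy as np

# Hypothetical confusion matrix (rows: reference, cols: classified);
# counts are illustrative only, not the study's results.
classes = ["background", "blue", "white", "yellow"]
cm = np.array([
    [50,  2,  1,  1],
    [ 3, 40,  2,  0],
    [ 1,  1, 45,  2],
    [ 0,  2,  1, 49],
])

total = cm.sum()
overall_accuracy = np.trace(cm) / total  # observed agreement p_o
# Chance agreement p_e: dot product of column and row marginal proportions
expected = (cm.sum(axis=0) @ cm.sum(axis=1)) / total**2
kappa = (overall_accuracy - expected) / (1 - expected)
print(f"overall accuracy: {overall_accuracy:.2%}, kappa: {kappa:.3f}")
```

Kappa discounts the agreement expected by chance alone, which is why it is reported alongside raw overall accuracy.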
Procedia PDF Downloads 140
13 Malaysian ESL Writing Process: A Comparison with England’s
Authors: Henry Nicholas Lee, George Thomas, Juliana Johari, Carmilla Freddie, Caroline Val Madin
Abstract:
Research in comparative and international education often provides value-laden views of an education system within and between countries. These views are frequently used by policy makers or educators to explore similarities and differences for, among other purposes, benchmarking. In this study, a comparison is made between Malaysia and England, focusing on the writing process children went through to create a text, using a multimodal theoretical framework to analyse this comparison. The main purpose is political in nature, as it serves as an answer to Malaysia’s call for benchmarking of best practices for language learning. Furthermore, the focus on writing adds to the empirical findings on early writers’ writing development and improvement, especially for children aged 5-9. Comparative studies of English as a Second Language (ESL) writing pedagogy, particularly in Malaysia since the introduction of the Standard-based English Language Curriculum (KSSR) as a draft in 2011, its full implementation in 2017, and the 2018 KSSR-CEFR alignment review, have not yet been carried out. In theory, a multimodal theoretical framework allows a logical comparison between first language and ESL writing, which would provide useful insights to illuminate the writing process in Malaysia and England. The comparisons are not representative because of the different school systems in the two countries. So far, the literature informs us that the curriculum for language learning emphasises children’s linguistic abilities, which include their proficiency and mastery of the language, its conventions, and technicalities. However, recent empirical findings suggest that literacy, in its concepts and characters, needs to change. In view of this suggestion, the comparison will look at how the writing process is implemented through the five modes of communication: linguistic, visual, aural, spatial, and gestural. 
This project draws on data from Malaysia and England, involving 10 teachers, 26 classroom observations, 20 lesson plans, 20 interviews, and 20 brief conversations with teachers. The research focused upon 20 primary children of different genders aged 5-9, and in addition to these primary data, 40 pieces of children’s work, 40 brief classroom conversations, 30 classroom photographs, and 30 school compound photographs were collected to investigate teachers’ and children’s use of modes and semiotic resources to design a text. The data were analysed by means of within-case analysis, cross-case analysis, and constant comparative analysis, with an initial stage of data categorisation followed by general and specific coding, which clustered the data into thematic groups. The study highlights the importance of teachers’ and children’s engagement and interaction with various modes of communication, an adaptation of the English approaches to teaching writing within the KSSR framework, and providing ‘voice’ to ESL writers to ensure that both have access to the knowledge and skills required to make decisions in developing multimodal texts and artefacts.
Keywords: comparative education, early writers, KSSR, multimodal theoretical framework, writing development
Procedia PDF Downloads 68
12 Life-Saving Design Strategies for Nursing Homes and Long-Term Care Facilities
Authors: Jason M. Hegenauer, Nicholas Fucci
Abstract:
In the late 1990s, a major deinstitutionalization movement of elderly patients took place; since then, the design of long-term care facilities has not been adequately analyzed in the United States. Over the course of the last 25 years, major innovations in construction methods, technology, and medicine have drastically changed the landscape of healthcare architecture. In light of recent events, and the expected increase in elderly populations as the baby-boomer generation ages, it is evident that reconsideration of these facilities is essential for the proper care of aging populations. The global response has been effective in stifling this pandemic; however, widespread disease still poses an imminent threat to the human race. Having witnessed the devastation Covid-19 has wreaked throughout nursing homes and long-term care facilities, it is evident that the current strategies for protecting our most vulnerable populations are not enough. Light renovation of existing facilities and previously overlooked considerations for new construction projects can drastically lower the risk at nursing homes and long-term care facilities. A reconfigured entry sequence supplements several of the features that have long been essentials of the design of these facilities. This research focuses on several aspects identified as needing improvement, including indoor environment quality, security measures incorporated into healthcare architecture and design, and architectural mitigation strategies for sick building syndrome. The results of this study have been compiled as 'best practices' for the design of future healthcare construction projects focused on the health, safety, and quality of life of the residents of these facilities. 
These design strategies, which can easily be implemented through renovation of existing facilities and in new construction projects, minimize the risk of infection and the spread of disease while allowing routine functions to continue with minimal impact, should the need for future lockdowns arise. Under the current lockdown procedures, which were implemented during the Covid-19 pandemic, isolation of residents has caused great unrest and worry for family members and friends as they are cut off from their loved ones. At this time, data is still being reported, leaving infection and death rates inconclusive; however, recent projections in some states list long-term care facility deaths as high as 60% of all deaths in the state. The population of these facilities consists of residents who are elderly, immunocompromised, and living with underlying chronic medical conditions. According to the Centers for Disease Control, these populations are particularly susceptible to infection and serious illness. The obligation to protect our most vulnerable population cannot be overlooked, and the harsh measures recently taken in response to the Covid-19 pandemic prove that the design strategies currently used for doing so are inadequate.
Keywords: building security, healthcare architecture and design, indoor environment quality, new construction, sick building syndrome, renovation
Procedia PDF Downloads 97
11 Numerical Study of Leisure Home Chassis under Various Loads by Using Finite Element Analysis
Authors: Asem Alhnity, Nicholas Pickett
Abstract:
The leisure home industry is experiencing an increase in sales due to the rise in popularity of staycations. However, customers are also demanding improvements in thermal and structural behaviour. Existing standards and codes of practice outline the requirements for leisure home design, but there is a lack of expertise in applying Finite Element Analysis (FEA) to complex structures in this industry. As a result, manufacturers rely on standardized design approaches, which often lead to excessively engineered or inadequately designed products. This research addresses this issue by comprehensively analysing the impact of the habitation structure on chassis performance in leisure homes. By employing FEA on the entire unit, including both the habitation structure and the chassis, the study seeks to develop a novel framework for designing and analysing leisure homes. The objectives include material reduction, enhancing structural stability, resolving existing design issues, and developing innovative modular and wooden chassis designs. The methodology is quantitative: FEA is used to analyse the performance of leisure home chassis under various loads, with simulations run on a numerical model of the chassis. Different load scenarios are applied to assess the stress and deflection performance of the chassis under various conditions. FEA is a numerical method that allows for accurate analysis of complex systems. The research uses flexible mesh sizing, with fine meshes to calculate small deflections around doors and windows and coarse meshes for macro deflections. This approach aims to minimize run time while providing meaningful stresses and deflections. 
Moreover, the study investigates the limitations and drawbacks of the popular approach of applying FEA only to the chassis and replacing the habitation structure with a distributed load. The findings indicate that this approach overlooks the strengthening contributed by the habitation structure. By employing FEA on the entire unit, it is possible to optimize stress and deflection performance while achieving material reduction and enhanced structural stability. The study also introduces innovative modular and wooden chassis designs, which show promising weight reduction compared to the existing heavily fabricated lattice chassis. In conclusion, this research provides valuable insights into the impact of the habitation structure on chassis performance in leisure homes. By employing FEA on the entire unit, the study demonstrates the importance of considering the strengthening generated by the habitation structure in chassis design. The findings contribute to advancements in material reduction, structural stability, and overall performance optimization. The novel framework developed in this study promotes sustainability, cost-efficiency, and innovation in leisure home design.
Keywords: static homes, caravans, motor homes, holiday homes, finite element analysis (FEA)
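The chassis models behind this abstract are not published, but the core FEA machinery it relies on can be illustrated with a minimal, self-contained sketch: a cantilevered Euler-Bernoulli beam under a tip load, assembled from standard two-node beam elements and checked against the closed-form deflection PL^3/3EI. All dimensions, section properties, and loads below are hypothetical, not the authors' chassis data.

```python
import numpy as np

E, I = 200e9, 8.0e-5   # illustrative modulus (Pa) and second moment of area (m^4)
L, n = 6.0, 8          # hypothetical chassis rail span (m), modelled with n elements
le = L / n
P = -5000.0            # hypothetical point load at the free end (N)

# Euler-Bernoulli beam element stiffness (dofs: [w1, theta1, w2, theta2])
k = (E * I / le**3) * np.array([
    [ 12,     6*le,   -12,     6*le   ],
    [ 6*le,   4*le**2, -6*le,  2*le**2],
    [-12,    -6*le,    12,    -6*le   ],
    [ 6*le,   2*le**2, -6*le,  4*le**2],
])

ndof = 2 * (n + 1)
K = np.zeros((ndof, ndof))
for e in range(n):
    dofs = [2*e, 2*e + 1, 2*e + 2, 2*e + 3]
    K[np.ix_(dofs, dofs)] += k  # assemble element into global stiffness

F = np.zeros(ndof)
F[-2] = P                    # vertical load on the tip translation dof
free = np.arange(2, ndof)    # clamp node 0 (deflection and rotation fixed)
u = np.zeros(ndof)
u[free] = np.linalg.solve(K[np.ix_(free, free)], F[free])

tip = u[-2]
print(f"FEA tip deflection: {tip*1e3:.3f} mm "
      f"(closed form PL^3/3EI: {P*L**3/(3*E*I)*1e3:.3f} mm)")
```

For this simple end-load case the element's cubic shape functions reproduce the exact solution, so the FEA tip deflection matches the closed form to solver precision; real chassis models trade this exactness for the mesh-density choices the abstract describes.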
Procedia PDF Downloads 100
10 Chemical and Biomolecular Detection at a Polarizable Electrical Interface
Authors: Nicholas Mavrogiannis, Francesca Crivellari, Zachary Gagnon
Abstract:
The development of low-cost, rapid, sensitive, and portable biosensing systems is important for the detection and prevention of disease in developing countries, biowarfare/antiterrorism applications, environmental monitoring, point-of-care diagnostic testing, and basic biological research. Currently, the most established, commercially available, and widespread assays for portable point-of-care detection and disease testing are paper-based dipstick and lateral flow test strips. These paper-based devices are often small, cheap, and simple to operate. The last three decades in particular have seen the emergence of these assays in diagnostic settings for the detection of pregnancy, HIV/AIDS, blood glucose, influenza, urinary protein, cardiovascular disease, respiratory infections, and blood chemistries. Such assays are widely available largely because they are inexpensive, lightweight, portable, and simple to operate, and a few platforms are capable of multiplexed detection for a small number of sample targets. However, there is a critical need for sensitive, quantitative, and multiplexed detection capabilities for point-of-care diagnostics and for the detection and prevention of disease in the developing world that cannot be satisfied by current state-of-the-art paper-based assays. For example, applications such as the detection of cardiac and cancer biomarkers and biothreat agents require sensitive multiplexed detection of analytes in the nM and pM range, and cannot currently be served by inexpensive portable platforms because of their lack of sensitivity, lack of quantitative capabilities, and often unreliable performance. In this talk, inexpensive label-free biomolecular detection at liquid interfaces using a newly discovered electrokinetic phenomenon known as fluidic dielectrophoresis (fDEP) is demonstrated. 
The electrokinetic approach exploits the electrical mismatches between two aqueous liquid streams forced to flow side-by-side in a microfluidic T-channel. In this system, one fluid stream is engineered to have a higher conductivity relative to its neighbor, which has a higher permittivity. When a “low” frequency (< 1 MHz) alternating current (AC) electric field is applied normal to this fluidic electrical interface, the fluid stream with high conductivity displaces into the low-conductivity stream. Conversely, when a “high” frequency (20 MHz) AC electric field is applied, the high-permittivity stream deflects across the microfluidic channel. There is, however, a critical frequency between these two events, sensitive to the electrical differences between the fluid phases – the fDEP crossover frequency – at which no fluid deflection is observed and the interface remains fixed when exposed to an external field. To perform biomolecular detection, two streams flow side-by-side in a microfluidic T-channel: one fluid stream carries an analyte of choice and the adjacent stream a specific receptor for the chosen target. The two fluid streams merge, and the fDEP crossover frequency is measured at different axial positions down the resulting liquid interface.
Keywords: biodetection, fluidic dielectrophoresis, interfacial polarization, liquid interface
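The abstract does not give the functional form of the fDEP crossover frequency. As a rough, order-of-magnitude stand-in, the classical Maxwell-Wagner relaxation frequency of two equal-thickness dielectric layers in series shows how a conductivity/permittivity mismatch sets a characteristic frequency between the quoted "low" (< 1 MHz) and "high" (20 MHz) regimes. All fluid properties below are invented for illustration and are not the authors' solutions or their actual crossover expression:

```python
import math

eps0 = 8.854e-12  # vacuum permittivity, F/m

# Hypothetical stream properties: stream 1 has higher conductivity,
# stream 2 higher permittivity (relative permittivities are illustrative)
sigma1, eps1 = 0.10, 78 * eps0    # S/m, F/m
sigma2, eps2 = 0.01, 110 * eps0   # S/m, F/m

# Classical Maxwell-Wagner relaxation for two equal layers in series,
# used here only as a rough stand-in for the fDEP crossover frequency
tau = (eps1 + eps2) / (sigma1 + sigma2)
f_c = 1 / (2 * math.pi * tau)
print(f"estimated characteristic frequency: {f_c/1e6:.1f} MHz")
```

With these made-up values the characteristic frequency lands around 10 MHz, i.e. between the two field regimes the abstract describes; in the detection scheme, binding at the interface shifts this frequency, which is what is measured down the channel.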
Procedia PDF Downloads 445