Search results for: damage measures
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 5929

1399 The Predictability of Three Implants to Support a Fixed Prosthesis in the Edentulous Mandible

Authors: M. Hirani, M. Devine, O. Obisesan, C. Bryant

Abstract:

Introduction: The use of four or more implants to support a fixed prosthesis in the edentulous mandible is well documented, with high implant and prosthetic survival rates recorded. A three-implant-supported fixed prosthesis, however, offers the potential to deliver a more cost-effective method of oral rehabilitation in the lower arch, an important consideration given that edentulism is most prevalent in low-income subpopulations. This study aimed to evaluate the implant and prosthetic survival rates, changes in marginal bone level, and patient satisfaction associated with a three-implant-supported fixed prosthesis for rehabilitation of the edentulous mandible over a follow-up period of at least one year. Methods: A comprehensive literature search was performed to identify studies that met the selection criteria. The information extracted included the study design and population, participant demographics, observation period, loading protocol, and the number of implants placed, together with the required outcome measures. Mean values and standard deviations (SD) were calculated using SPSS® (IBM Corporation, New York, USA), and the level of statistical significance across all comparative studies was set at P < 0.05. Results: The eligible studies included a total of 1968 implants placed in 652 patients. The subjects ranged in age from 33 to 89 years, with a mean of 63.2 years. The mean cumulative implant and prosthetic survival rates were 95.5% and 96.2%, respectively, over a mean follow-up period of 3.25 years. The mean marginal bone loss recorded was 1.04 mm, and high patient satisfaction rates were reported across the studies. Conclusion: Current evidence suggests that a three-implant-supported fixed prosthesis for the edentulous mandible is a successful treatment strategy, presenting high implant and prosthetic survival rates over the short-to-medium term. Further well-designed controlled clinical trials are required to evaluate longer-term outcomes, with supplemental data correlating implant dimensions and prosthetic design.

Keywords: implants, mandible, fixed, prosthesis

Procedia PDF Downloads 131
1398 Experimental Study Analysis of Flow over Pickup Truck’s Cargo Area Using Bed Covers

Authors: Jonathan Rodriguez, Dominga Guerrero, Surupa Shaw

Abstract:

Automobiles are modeled in various forms, and they interact with air when in motion. Aerodynamics is the study of such interactions, where solid bodies affect the way air moves around them. The shape of a solid body affects the ease with which it moves through the air, so any additional freightage, or load, impacts a vehicle's aerodynamics. Transporting people and cargo safely is essential, yet despite various safety measures, vehicle-related accidents remain numerous. This study explores the effects an automobile experiences with added cargo and covers, since the addition of these items changes the original vehicle shape and the approved design for safe driving. This paper showcases the effects of the changed vehicle shape and design via experimental testing conducted on a physical 1:27-scale model and a CAD model of an F-150 pickup truck, the most common pickup truck in the United States, with differently shaped loads and weights traveling at a constant speed. The additional freightage produces unwanted drag or lift, resulting in lower fuel efficiency and unsafe driving conditions. This study employs an adjustable external shell on the F-150 pickup truck to create a controlled aerodynamic geometry that combats the detrimental effects of additional freightage. The results utilize colored powder (which acts as a visual medium for the interaction of air with the vehicle) to highlight the impact of the additional freight on the automobile's external shell, alongside simulation models, built in Altair CFD software, of twelve cases concerning the effects of an added load on an F-150 pickup truck. This paper is an attempt toward standardizing the geometric design of the external shell, given the uniqueness of every load and its placement on the vehicle, while providing real-time data to be compared with simulation results from the existing literature.

Keywords: aerodynamics, CFD, freightage, pickup cover

Procedia PDF Downloads 168
1397 Study of University Course Scheduling for Crowd Gathering Risk Prevention and Control in the Context of Routine Epidemic Prevention

Authors: Yuzhen Hu, Sirui Wang

Abstract:

As training bases for intellectual talent, universities have large numbers of students. Teaching is a primary activity in universities, and during the teaching process large numbers of people gather both inside and outside the teaching buildings, posing a strong risk of close contact. The class schedule is the fundamental basis for teaching activities in universities and plays a crucial role in the management of teaching order: different class schedules lead to varying degrees of indoor gathering and different trajectories for class attendees. In recent years, highly contagious diseases have frequently occurred worldwide, and reducing the risk of infection has remained a pressing public safety issue. "Reducing gatherings" is one of the core measures in epidemic prevention and control, and in specific environments it can be achieved through scientific scheduling. Therefore, the prevention and control goal can be met by considering the reduction of excessive crowding when arranging the course schedule. Firstly, addressing personnel gathering along the various pathways on campus, we establish a nonlinear mathematical model with the goals of minimizing congestion and maximizing teaching effectiveness. Next, we design an improved genetic algorithm, incorporating real-time evacuation operations based on tracking search and multidimensional positive gradient cross-mutation operations, which account for the characteristics of outdoor crowd evacuation. Finally, we conduct a case study using undergraduate course data from a university in Harbin. The case study compares and analyzes the effects of the algorithm improvements and of the optimization of gathering situations, and explores the impact of path blocking on the degree of gathering on other pathways.
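The core loop described above (a congestion-minimizing objective searched with a genetic algorithm) can be illustrated with a heavily simplified sketch. The course counts, class sizes, quadratic congestion penalty, and GA parameters below are all invented for illustration; the paper's improved operators (tracking-search evacuation, multidimensional positive gradient cross-mutation) are not reproduced here.

```python
import random

random.seed(0)

N_COURSES = 20   # hypothetical number of course sections
N_SLOTS = 6      # hypothetical number of teaching timeslots
SIZES = [random.randint(30, 120) for _ in range(N_COURSES)]  # class sizes

def congestion(schedule):
    """Sum of squared headcounts per slot: a quadratic penalty on gatherings."""
    per_slot = [0] * N_SLOTS
    for course, slot in enumerate(schedule):
        per_slot[slot] += SIZES[course]
    return sum(n * n for n in per_slot)

def crossover(a, b):
    cut = random.randrange(1, N_COURSES)
    return a[:cut] + b[cut:]

def mutate(schedule, rate=0.05):
    return [random.randrange(N_SLOTS) if random.random() < rate else s
            for s in schedule]

def evolve(pop_size=60, generations=200):
    pop = [[random.randrange(N_SLOTS) for _ in range(N_COURSES)]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=congestion)              # minimize congestion
        elite = pop[:pop_size // 3]
        children = [mutate(crossover(random.choice(elite), random.choice(elite)))
                    for _ in range(pop_size - len(elite))]
        pop = elite + children
    return min(pop, key=congestion)

best = evolve()
print(congestion(best))
```

Because the penalty is quadratic, the GA is driven toward spreading headcounts evenly across slots, which is the "reduce gatherings" objective in miniature.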

Keywords: the university timetabling problem, risk prevention, genetic algorithm, risk control

Procedia PDF Downloads 88
1396 Assessing Livestock Depredation by the Himalayan Wolf in Neshyang Valley, Manang, Nepal

Authors: Tenzing Lama, Ganga Ram Regmi, Thakur Silwal, Rinzin Punjok Lama

Abstract:

Livestock depredation by wolves and the associated financial losses suffered by herders are perhaps the most important issues leading to human-wolf conflict. As a result, recolonizing wolves remain among the most persecuted large carnivores in the Nepal Himalaya, suffering high mortality from retaliatory killings by herdsmen. Reducing such depredation is crucial to gaining herders' support for conservation programs and ensuring the long-term survival of these carnivores. In February 2018, a questionnaire survey was conducted with 33 herders from different settlements in the Neshyang valley of Manang district to assess the status of human-wolf conflict in terms of livestock loss and herders' attitudes. A total of 36 livestock were lost to wolves between March 2017 and February 2018, an average loss of 1.09 ± 0.48 (SE) livestock heads per herder, representing 1.5% of total holdings. The estimated financial value of the livestock lost was equivalent to US$ 25,428, an average of US$ 770 per herder. The majority of the herders (80%) expressed a negative attitude towards the wolf, but only a few (6.06%) suggested removing wolves from the valley. The incidence of livestock loss differed significantly by time of day (highest in the daytime) and by season (highest in winter), when herders leave their livestock (except goats/sheep) free in the pastures. Wolves showed positive selectivity for horses (EI = 0.59), yaks (EI = 0.24) and cattle (EI = 0.14) but strong avoidance of goats/sheep (EI = -1). This study suggests that livestock depredation by wolves could be minimized through improved livestock husbandry practices, implementation of mitigation measures (e.g., corral improvement), and immediate relief to the victims. Conservation education and awareness programs to enhance herders' knowledge of the ecological importance of the wolf, provision of relief schemes, and law enforcement are also recommended.
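The EI values quoted are consistent with Ivlev's electivity index, EI = (r - p)/(r + p), where r is a livestock type's share of kills and p its share of the available holdings. A minimal sketch with hypothetical kill and holding counts (the abstract does not give the raw counts, so the numbers below are illustrative only):

```python
def ivlev_electivity(kills, holdings):
    """Ivlev's electivity index per livestock type.

    kills, holdings: dicts of counts. EI ranges over [-1, 1];
    positive values indicate selection, negative values avoidance."""
    total_k = sum(kills.values())
    total_h = sum(holdings.values())
    ei = {}
    for sp in holdings:
        r = kills.get(sp, 0) / total_k   # proportion of kills
        p = holdings[sp] / total_h       # proportion available
        ei[sp] = (r - p) / (r + p) if (r + p) > 0 else 0.0
    return ei

# hypothetical counts for illustration (not the study's raw data)
kills = {"horse": 14, "yak": 12, "cattle": 10, "goat_sheep": 0}
holdings = {"horse": 300, "yak": 600, "cattle": 500, "goat_sheep": 1000}

ei = ivlev_electivity(kills, holdings)
print(ei)
```

Note that any type with zero kills but nonzero availability gets EI = -1 exactly, matching the reported strong avoidance of goats/sheep.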

Keywords: canis lupus chanco, conservation education, human wildlife conflict, compensation schemes

Procedia PDF Downloads 16
1395 Assessment of Bisphenol A and 17 α-Ethinyl Estradiol Bioavailability in Soils Treated with Biosolids

Authors: I. Ahumada, L. Ascar, C. Pedraza, J. Montecino

Abstract:

The addition of biosolids to soil has been found to benefit soil health, enriching the soil with essential nutrient elements. Although this sludge has properties that improve the physical features and productivity of agricultural and forest soils and help recover degraded soils, it also contains trace elements, trace organic compounds, and pathogens that can damage the environment. Application of these biosolids to land without complete treatment, and use of the treated wastewater, can transfer these compounds into terrestrial and aquatic environments, giving rise to potential accumulation in plants. The general aim of this study was to evaluate the bioavailability of bisphenol A (BPA) and 17 α-ethinyl estradiol (EE2) in a soil-biosolid system using wheat (Triticum aestivum) plant assays and a predictive extraction method based on a solution of hydroxypropyl-β-cyclodextrin (HPCD), to determine whether the latter is a reliable surrogate for the bioassay. Two soils were obtained from the central region of Chile (Lo Prado and Chicauma). Biosolids were obtained from a regional wastewater treatment plant. The soils were amended with biosolids at 90 Mg ha-1. Soils treated with biosolids and spiked with 10 mg kg-1 of EE2 and with 15 mg kg-1 and 30 mg kg-1 of BPA were also included. The BPA and EE2 concentrations were determined in biosolid, soil, and plant samples through ultrasound-assisted extraction, solid phase extraction (SPE), and gas chromatography coupled to mass spectrometry (GC/MS). The bioavailable fraction found in each of the soils cultivated with wheat plants was compared with the results of the cyclodextrin biosimulator method. The total concentrations found in the biosolid from the treatment plant were 0.150 ± 0.064 mg kg-1 of EE2 and 12.8 ± 2.9 mg kg-1 of BPA. BPA and EE2 bioavailability is affected by the organic matter content and the physical and chemical properties of the soil. The bioavailability response of both compounds in the two soils varied with the EE2 and BPA concentration. For EE2, wheat plants contained higher concentrations in the roots than in the shoots, and the EE2 concentration increased with an increasing biosolids rate. For BPA, by contrast, a higher concentration was found in the shoots than in the roots. The predictive capability of the HPCD extraction was assessed using a simple linear correlation test for both compounds in wheat plants. The correlation coefficient between the EE2 concentrations obtained from the HPCD extraction and those obtained from the wheat plants was r = 0.99 (p ≤ 0.05). For BPA, no correlation was found. The methodology was therefore validated against wheat plant bioassays only in the case of EE2. Acknowledgments: The authors thank FONDECYT 1150502.
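The validation step (a simple linear correlation between HPCD-extracted and plant-measured concentrations) can be sketched as follows; the paired concentrations below are hypothetical and chosen only to illustrate the kind of near-perfect correlation reported for EE2:

```python
import numpy as np

# hypothetical paired EE2 concentrations (mg/kg):
# HPCD extraction vs. measured wheat plant uptake
hpcd  = np.array([0.21, 0.35, 0.48, 0.62, 0.80, 1.05])
plant = np.array([0.18, 0.33, 0.45, 0.60, 0.78, 1.02])

r = np.corrcoef(hpcd, plant)[0, 1]      # Pearson correlation coefficient
n = len(hpcd)
# t statistic for H0: rho = 0, with n - 2 degrees of freedom
t = r * np.sqrt((n - 2) / (1 - r**2))
print(round(r, 3), round(t, 2))
```

A significant r close to 1 is what justifies using the HPCD extraction as a surrogate for the plant bioassay; the absence of such a correlation is why the method failed validation for BPA.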

Keywords: emerging compounds, bioavailability, biosolids, endocrine disruptors

Procedia PDF Downloads 145
1394 Trends and Inequalities in Distance to and Use of Nearest Natural Space in the Context of the 20-Minute Neighbourhood: A 4-Wave National Repeat Cross-Sectional Study, 2013 to 2019

Authors: Jonathan R. Olsen, Natalie Nicholls, Jenna Panter, Hannah Burnett, Michael Tornow, Richard Mitchell

Abstract:

The 20-minute neighbourhood is a policy priority for governments worldwide, and a key feature of this policy is providing access to natural space within 800 meters of home. The study aims were to (1) examine the association between distance to the nearest natural space and frequent use over time and (2) examine whether frequent use and changes in use were patterned by income and housing tenure over time. Bi-annual Scottish Household Survey data were obtained for 2013 to 2019 (n = 42,128, aged 16+). Adults were asked the walking distance to their nearest natural space, the frequency of visits to this space, and their housing tenure, as well as their age, sex, and income. We examined the association between the distance from home of the nearest natural space, housing tenure, and the likelihood of frequent natural space use (visited once a week or more). Two-way interaction terms were then applied to explore variation in the association between tenure and frequent natural space use over time. We found that 87% of respondents lived within a 10-minute walk of a natural space, meeting the policy specification for a 20-minute neighbourhood. Greater proximity to natural space was associated with increased use; individuals living a 6-10-minute walk away and those more than a 10-minute walk away were, respectively, 53% and 78% less likely to report frequent natural space use than those living within a 5-minute walk. Housing tenure was an important predictor of frequent natural space use; private renters and homeowners were more likely to report frequent use than social renters. Our findings provide evidence that proximity to natural space is a strong predictor of frequent use. Our study also provides important evidence that time-based access measures alone do not capture deep-rooted socioeconomic variation in the use of natural space. Policy makers should apply a nuanced lens to operationalising and monitoring the 20-minute neighbourhood to safeguard against exacerbating existing inequalities.
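The "53% and 78% less likely" figures read like odds ratios from a logistic model of frequent use. A small sketch of the conversion between model coefficients, odds ratios, and "% less likely" (the coefficients below are back-derived from the reported percentages purely for illustration, not taken from the study's model):

```python
import math

def pct_less_likely(beta):
    """Convert a logistic-regression coefficient to an odds ratio and a
    '% less likely' figure relative to the reference category."""
    odds_ratio = math.exp(beta)
    return odds_ratio, (1.0 - odds_ratio) * 100.0

# hypothetical coefficients for distance bands vs. the <5-minute reference
bands = {"6-10 min": math.log(0.47), ">10 min": math.log(0.22)}

for band, beta in bands.items():
    or_, pct = pct_less_likely(beta)
    print(band, round(or_, 2), round(pct))
```

An odds ratio of 0.47 corresponds to "53% less likely" and 0.22 to "78% less likely", which matches the phrasing used in the abstract.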

Keywords: natural space, housing, inequalities, 20-minute neighbourhood, urban design

Procedia PDF Downloads 120
1393 An Inventory Management Model to Manage the Stock Level for Irregular Demand Items

Authors: Riccardo Patriarca, Giulio Di Gravio, Francesco Costantino, Massimo Tronci

Abstract:

An accurate inventory management policy plays a crucial role in several high-availability sectors. In these sectors, due to the high cost of spares and backorders, an (S-1, S) replenishment policy is necessary for high-availability items: a replacement item is shipped each time the inventory level decreases by one. This policy can be modelled following the Multi-Echelon Technique for Recoverable Item Control (METRIC). METRIC is a system-based technique for defining the optimum stock level in a multi-echelon network, adopting measures in line with the decision-maker's perspective. METRIC defines an availability-cost function combining inventory costs and required service levels, using as inputs data about the demand trend, the supply and maintenance characteristics of the network, and the budget/availability constraints. The traditional METRIC relies on the hypothesis that a Poisson distribution represents the demand distribution well for items with a low failure rate. In this research, we explore the effects of using a Poisson distribution to model the demand for low-failure-rate items characterized by an irregular demand trend, a characteristic not captured by the traditional METRIC formulation, which therefore needs revision. Using the CV (Coefficient of Variation) and ADI (Average inter-Demand Interval) classification, we identify the inherent flaws of Poisson-based METRIC for irregular-demand items and define an innovative ad hoc distribution that better fits irregular demands. This distribution allows proper stock levels to be defined, reducing the stocking and backorder costs caused by high irregularity in the demand trend. A case study in the aviation domain clarifies the benefits of this innovative METRIC approach.

Keywords: METRIC, inventory management, irregular demand, spare parts

Procedia PDF Downloads 347
1392 Correlates of Multiplicity of Risk Behavior among Injecting Drug Users in Three High HIV Prevalence States of India

Authors: Santosh Sharma

Abstract:

Background: Drug abuse, needle sharing, and risky sexual behaviour often compound one another to increase the risk of HIV transmission. Injecting drug users (IDUs) are at the dual risk of needle sharing and risky sexual behaviour, making them more vulnerable to STI and HIV. Thus, studying the interface of injecting drug use and risky sexual behaviour is important to curb the pace of the HIV epidemic among IDUs. The aim of this study is to determine the factors associated with HIV among injecting drug users in three states of India. Materials and methods: This paper analyzes covariates of the multiplicity of risk behaviour among injecting drug users. Findings are based on data from the Integrated Behavioral and Biological Assessment (IBBA) round 2, 2010. The IBBA collected information on IDUs from six districts. IDUs were selected on the criteria of being 18 years or older and having injected addictive substances/drugs for non-medical purposes at least once in the past six months. A total of 1,979 IDUs were interviewed in round 2 of the IBBA. The study employs quantitative techniques using standard statistical tools to achieve the above objectives. All results presented in this paper are unweighted univariate measures. Results: Among IDUs, the average duration of injecting drugs is 5.2 years. The mean duration from first drug use to first injecting drug use among younger IDUs (18-24 years) is 2.6 years. Needle cleaning is common, with above two-fifths reporting cleaning every time. Needle sharing is quite prevalent, especially among younger IDUs. Further, IDUs practicing needle sharing exhibit pervasive multi-partner behaviour. Condom use with commercial partners is almost 81%, whereas with intimate partners it is 39%. The coexistence of needle sharing and unprotected sex raises STI prevalence (6.8%), which is further pronounced among the divorced/separated/widowed (9.4%). Conclusion: Work towards risk reduction for IDUs must deal with this multiplicity of risk. Interventions should address the covariates of risk, focusing on youth and risky sexual behaviour.

Keywords: IDUs, HIV, STI, behaviour

Procedia PDF Downloads 279
1391 A Novel Harmonic Compensation Algorithm for High Speed Drives

Authors: Lakdar Sadi-Haddad

Abstract:

The study of very high-speed electrical drives has seen a resurgence of interest in the past few years; an inventory of the scientific papers and patents dealing with the subject confirms its relevance. The democratization of magnetic bearing technology is at the origin of recent developments in high-speed applications. These machines have, as their main advantage, a much higher power density than the state of the art. Nevertheless, particular attention should be paid to the design of the inverter as well as to control and command. The surface-mounted permanent magnet synchronous machine is the most appropriate technology for addressing high-speed issues. However, it has the drawback of requiring a carbon sleeve to retain the magnets, which could otherwise tear apart under the centrifugal forces generated at the rotor periphery. Carbon fiber is well known for its mechanical properties, but it conducts heat poorly, resulting in very poor evacuation of the eddy current losses induced in the magnets by the time and space harmonics of the stator. The three-phase inverter is the main harmonic source causing eddy currents in the magnets. In high-speed applications such harmonics are harmful because, on the one hand, the characteristic impedance is very low and, on the other hand, the ratio between the switching frequency and that of the fundamental is much lower than in the state of the art. To minimize the impact of these harmonics, a first lever is to use a modulation strategy producing low harmonic distortion, while a second is to introduce a sinus filter between the inverter and the machine to smooth the voltage and current waveforms applied to the machine. Nevertheless, in a very high-speed machine the interaction of the processes mentioned above may introduce particular harmonics that can irreversibly damage the system: harmonics at the resonant frequency, harmonics at the shaft mode frequency, subharmonics, etc. Some studies address these issues but treat the phenomena with separate solutions (specific modulation strategies, active damping methods, etc.). The purpose of this paper is to present a complete new active harmonic compensation algorithm, based on an improvement of standard vector control, as a global solution to all these issues. The presentation is based on a complete theoretical analysis of the processes leading to the generation of such undesired harmonics. A state of the art of available solutions is then provided before the content of the new active harmonic compensation algorithm is developed. The study is completed by a validation using simulations and a practical case on a high-speed machine.
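A prerequisite of any active harmonic compensation scheme is measuring the amplitude and phase of the offending harmonic in the machine currents before injecting a cancelling reference. As an illustrative first step only (not the paper's algorithm), a single-bin DFT over an integer number of fundamental periods recovers one harmonic phasor; the sampling rate, fundamental frequency, and amplitudes below are hypothetical:

```python
import cmath
import math

def harmonic_phasor(samples, fs, f_target):
    """Single-bin DFT: complex amplitude of one harmonic in a real waveform
    sampled at fs over an integer number of periods of f_target."""
    n = len(samples)
    acc = sum(x * cmath.exp(-2j * math.pi * f_target * k / fs)
              for k, x in enumerate(samples))
    return 2.0 * acc / n  # peak amplitude and phase at f_target

fs = 100_000.0                      # hypothetical sampling rate, Hz
f1, f5 = 1000.0, 5000.0             # fundamental and 5th harmonic
t = [k / fs for k in range(1000)]   # exactly 10 fundamental periods

# hypothetical phase current: unit fundamental plus a 10% 5th harmonic
current = [math.sin(2 * math.pi * f1 * ti) + 0.1 * math.sin(2 * math.pi * f5 * ti)
           for ti in t]

ph5 = harmonic_phasor(current, fs, f5)
print(abs(ph5))  # measured 5th-harmonic amplitude to be compensated
```

In a vector-control loop this measurement would feed a compensating voltage reference at the same frequency and opposite phase; the paper's contribution is doing this globally for resonant, shaft-mode, and subharmonic components at once.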

Keywords: active harmonic compensation, eddy current losses, high speed machine

Procedia PDF Downloads 395
1390 GIS-Based Water Pollution Assessment of Buriganga River, Bangladesh

Authors: Nur-E-Jannat Tinu

Abstract:

Water is absolutely vital not only for the survival of human beings but also for plants, animals, and all other living organisms. Water bodies such as lakes, rivers, ponds, and estuaries are the sources of water supply for domestic, industrial, agricultural, and aquaculture purposes. The Buriganga River flows through the south and west of Dhaka city. The water quality of this river has become a matter of concern due to the anthropogenic introduction of pollutants such as industrial effluents, urban sewage, and solid wastes in this area. The Buriganga River is at risk of contamination from untreated municipal wastes, industrial discharges, runoff of organic and inorganic fertilizers, pesticides and insecticides, and oil emissions around the river. The residential and commercial establishments along the river discharge wastewater either directly into the river or through drains and canals. Several regulatory measures and policies have been enforced by the Government to protect the river Buriganga from pollution, in most cases to no effect. Water quality assessment reveals that the water is not appropriate even for irrigation purposes. The physicochemical parameters (pH, TDS, EC, temperature, DO, COD, BOD) indicated that the water is too poor to be usable for agricultural, drinking, or other purposes. Chemical concentrations showed significant seasonal variations, with high concentrations during the monsoon season, presumably due to extreme seasonal surface runoff. A comparative study of electrical conductivity (EC) and total dissolved solids (TDS) indicated a considerable increase over the last five years. A change in trend was observed from June-July 2020, probably due to the monsoon and post-monsoon; EC values decreased from 775 to 665 mmho/cm during this period. DO increased significantly from the mid-post-monsoon months to the early monsoon period. The pH of the river water ranges between 6.5 and 7.79, i.e., from near-neutral to slightly alkaline, which suggests that organic compounds cause the water to become alkaline during the monsoon and post-monsoon seasons. As the water pollution level is very high, an effective remediation and pollution control plan should be considered.

Keywords: precipitation, spatial distribution, effluent, remediation

Procedia PDF Downloads 140
1389 Evaluating the Effect of Spatial Qualities, Openness and Complexity, on Human Cognitive Performance within Virtual Reality

Authors: Pierre F. Gerard, Frederic F. Leymarie, William Latham

Abstract:

Architects have developed a series of objective evaluations, using spatial analysis tools such as the isovist, that show how certain spatial qualities are beneficial to specific human activities hosted in built environments. In return, they can build better-adapted environments by tuning those spatial qualities in their designs. In parallel, engineers have developed virtual reality technologies with the dream of creating systems that immerse users in new forms of spatial experience. These technologies have already demonstrated a useful range of benefits, from simulating critical events to help people acquire new skills to enhancing memory retention, to name just a few. This paper investigates the effects of two spatial qualities, openness and complexity, on cognitive performance within immersive virtual environments. Isovist measures were used to design a series of room settings with different levels of each spatial quality. In an empirical study, each room was then used by every participant to solve a navigational puzzle game and was rated by them for spatial experience. Participants were then asked to fill in a questionnaire before solving a visual-spatial memory quiz, which addressed how well they remembered the different rooms. Findings suggest that these spatial qualities affect some of the measures, including navigation performance and memory retention. In particular, there is an order effect for the navigation puzzle game: participants tended to spend a longer time in the complex room settings. Moreover, there is an interaction effect: with more open settings, participants tended to perform better in a simple setting, whereas with more closed settings, they tended to perform better in a more complex setting. For the visual-spatial memory quiz, participants performed significantly better within the more open rooms. We believe this is a first step towards using virtual environments to enhance participants' cognitive performance through better use of specific spatial qualities.
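An isovist is the region of space visible from a point, and openness can be scored by its area. A minimal ray-casting sketch on an occupancy grid (the grid, ray count, and step size are arbitrary illustrations, not the actual tool used in the study):

```python
import math

def isovist_area(grid, x0, y0, n_rays=360, step=0.1):
    """Approximate isovist area from (x0, y0) on an occupancy grid
    (1 = wall, 0 = free) by casting rays and summing circular sectors."""
    h, w = len(grid), len(grid[0])
    total = 0.0
    for i in range(n_rays):
        a = 2 * math.pi * i / n_rays
        d = 0.0
        while True:
            x = x0 + d * math.cos(a)
            y = y0 + d * math.sin(a)
            # stop at the grid boundary or the first wall cell
            if not (0 <= int(x) < w and 0 <= int(y) < h) or grid[int(y)][int(x)]:
                break
            d += step
        total += 0.5 * d * d * (2 * math.pi / n_rays)  # sector area ~ r^2 * dθ / 2
    return total

# a 10x10 room: solid walls around an 8x8 free interior
room = [[1 if i in (0, 9) or j in (0, 9) else 0 for j in range(10)]
        for i in range(10)]
area = isovist_area(room, 5.0, 5.0)
print(area)
```

Larger isovist areas correspond to more open settings; a complexity score could analogously be derived from the variation of the ray lengths (e.g., the perimeter-to-area ratio of the isovist polygon).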

Keywords: architecture, navigation, spatial cognition, virtual reality

Procedia PDF Downloads 130
1388 Neuro-Epigenetic Changes on Diabetes Induced-Synaptic Fidelity in Brain

Authors: Valencia Fernandes, Dharmendra Kumar Khatri, Shashi Bala Singh

Abstract:

Background and Aim: Epigenetic marks are the silent signatures of several pathological processes in the brain. This study examines the influence of DNA methylation, a major epigenetic modification, in the prefrontal cortex and hippocampus of the diabetic brain and its notable effect on cellular chaperones and synaptic proteins. Method: Chronic high-fat-diet- and STZ-induced diabetic mice were studied for cognitive dysfunction, and global DNA methylation as well as DNA methyltransferase (DNMT) activity were assessed. Further, the cellular chaperones and synaptic proteins were examined using the DNMT inhibitor 5-aza-2′-deoxycytidine (5-aza-dC), delivered via intracerebroventricular injection. The percent methylation of these synaptic proteins was also studied in order to correlate the epigenetic involvement, and their interaction with the DNMT enzyme was examined computationally using bioinformatic tools. Histological studies of morphological alterations and neuronal degeneration were also performed. Neurogenesis, a characteristic marker of new learning and memory formation, was assessed via BrdU staining. Finally, behavioral studies, including the Morris water maze, Y maze, passive avoidance, and novel object recognition tests, were performed to assess cognitive function. Results: Altered global DNA methylation and increased levels of DNMTs within the nucleus were confirmed in the cortex and hippocampus of the diseased mice, suggesting hypermethylation at the genetic level. Treatment with 5-aza-dC, a global DNA demethylating agent, ameliorated the protein and gene expression of the cellular chaperones and synaptic fidelity. Furthermore, the methylation analysis profile showed hypermethylation of HSF1, a master regulator of chaperones, confirming the epigenetic involvement in the diseased brain. Morphological improvements and decreased neurodegeneration, along with enhanced neurogenesis in the treatment group, suggest that epigenetic modulations participate in learning and memory, a conclusion supported by the improved behavioral test battery seen in the treatment group. Conclusion: DNA methylation may contribute to the dysregulation of memory-associated proteins at chronic stages of type 2 diabetes. This suggests a substantial contribution to the pathophysiology underlying several metabolic syndromes, such as insulin resistance and obesity, and a role in extending this damage centrally as cognitive dysfunction.

Keywords: epigenetics, cognition, chaperones, DNA methylation

Procedia PDF Downloads 204
1387 Exploring the Impact of Cultural Values on the Performance of Women Bureaucrats in Pakistan

Authors: Fariya Tahreen

Abstract:

Women are an important part of society, comprising roughly half of the world's population. The participation of women in public services is increasing in the present era, while cultural values embedded with gender differences still influence the performance of working women. Much research has been carried out on the cultural impact on working women such as managers, doctors, lawyers, and other public servants, but very few efforts have been made to study the impact of cultural values on the performance of women bureaucrats. The present study aimed to find the relationship of cultural values (i.e., collective identity, gender segregation, and gender asymmetrical relations) with the performance of women bureaucrats. The sample comprised 130 women bureaucrats from the Office Management Group, Inland Revenue, District Management Group, and Pakistan Police Services, selected through a convenience sampling technique. The locale of the study was the cities of Islamabad, Rawalpindi, and Lahore. The research used a quantitative method, and data were collected through a survey. The measures used included personal information, the three main cultural values, and the performance of women bureaucrats. Univariate and bivariate analyses were applied using correlation and multiple linear regression tests. The study shows a significant negative relationship between cultural values and the performance of women bureaucrats (R² = 0.790, p < 0.001): collective identity, gender segregation, and gender asymmetrical relations all significantly influence performance. Due to the influence and pressure of these cultural values, women bureaucrats give less time to the office and take more leave. They also avoid contact with male colleagues, public dealings, field visits, and leadership roles. Further, they attend fewer policy formulation meetings because they are given less importance in them. In a nutshell, the study concluded that cultural values significantly influence the performance of women bureaucrats in Pakistan.
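The analysis pipeline above (multiple linear regression of performance on three cultural-value scores, summarized by R²) can be sketched as follows. The data are entirely synthetic, generated only to mirror the direction of the reported relationship; none of the coefficients correspond to the study's estimates.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 130  # same sample size as the study; the data themselves are synthetic

# hypothetical predictor scores: collective identity, gender segregation,
# gender asymmetrical relations (higher = stronger cultural constraint)
X = rng.normal(size=(n, 3))
# performance declines with each cultural value, plus noise
y = 5.0 - 0.8 * X[:, 0] - 0.6 * X[:, 1] - 0.5 * X[:, 2] \
    + rng.normal(scale=0.5, size=n)

A = np.column_stack([np.ones(n), X])          # design matrix with intercept
beta, *_ = np.linalg.lstsq(A, y, rcond=None)  # OLS coefficients
resid = y - A @ beta
r2 = 1 - (resid @ resid) / ((y - y.mean()) @ (y - y.mean()))
print(np.round(beta, 2), round(r2, 3))
```

All three fitted slopes come out negative and R² is high, which is the pattern the abstract summarizes as a significant negative relationship.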

Keywords: cultural values, performance, Pakistan, women bureaucrats

Procedia PDF Downloads 125
1386 Assessment of Biochemical Marker Profiles and Their Impact on Morbidity and Mortality of COVID-19 Patients in Tigray, Ethiopia

Authors: Teklay Gebrecherkos, Mahmud Abdulkadir

Abstract:

The emergence and subsequent rapid worldwide spread of the COVID-19 pandemic have posed a global crisis, with a tremendously increasing burden of infection, morbidity, and mortality risks. Recent studies have suggested that severe cases of COVID-19 are characterized by massive biochemical, hematological, and inflammatory alterations whose synergistic effect is thought to progress to multiple organ damage and failure. In this regard, biochemical monitoring of COVID-19 patients, based on comprehensive laboratory assessments and findings, is expected to play a crucial role in effective clinical management and in improving patient survival rates. However, biochemical markers that can inform COVID-19 patient risk stratification and predict clinical outcomes remain scarce. This study aims to investigate the profiles of common biochemical markers and their influence on the severity of COVID-19 infection in Tigray, Ethiopia. Methods: A laboratory-based cross-sectional study was conducted from July to August 2020 at the Quiha College of Engineering, Mekelle University COVID-19 isolation and treatment center. Sociodemographic and clinical data were collected using a structured questionnaire. Whole blood was collected from each study participant, and serum samples were separated after delivery to the laboratory. Hematological biomarkers were analyzed using a FACS counter, while organ function tests and serum electrolytes were analyzed using ion-selective electrode methods on a Cobas 6000 series machine. Data were analyzed using SPSS v20. Results: A total of 120 SARS-CoV-2 patients were enrolled during the study. The participants ranged between 18 and 91 years, with a mean age of 52 (±108.8). The largest group of participants (40%) was aged 60 and above. Patients with multiple comorbidities developed severe COVID-19, though this was not statistically significant (p=0.34). 
Mann-Whitney U test analysis showed that laboratory markers such as neutrophil count (p=0.003), AST level (p=0.050), serum creatinine (p=0.000), and serum sodium (p=0.015) were significantly associated with severe COVID-19 disease compared with non-severe disease. Conclusion: The severity of COVID-19 was associated with higher age, the organ function tests AST and creatinine, serum Na+, and an elevated total neutrophil count. Thus, further studies need to be conducted to evaluate the alterations of biochemical biomarkers and their impact on COVID-19.
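The severe-versus-non-severe comparison described above can be sketched with a rank-sum computation. This minimal pure-Python version computes only the Mann-Whitney U statistic (no p-value), and the biomarker values below are hypothetical, not data from the Tigray cohort.

```python
def mann_whitney_u(group_a, group_b):
    """Mann-Whitney U statistic for two independent samples.

    Ranks the pooled observations (midranks for ties) and returns
    the smaller of U_a and U_b.
    """
    pooled = sorted((v, label) for label, grp in (("a", group_a), ("b", group_b))
                    for v in grp)
    values = [v for v, _ in pooled]
    rank_sum_a = 0.0
    i = 0
    while i < len(pooled):
        j = i
        while j < len(pooled) and values[j] == values[i]:
            j += 1
        midrank = (i + 1 + j) / 2.0  # average of the 1-based ranks i+1 .. j
        for k in range(i, j):
            if pooled[k][1] == "a":
                rank_sum_a += midrank
        i = j
    n_a, n_b = len(group_a), len(group_b)
    u_a = rank_sum_a - n_a * (n_a + 1) / 2.0
    u_b = n_a * n_b - u_a
    return min(u_a, u_b)

# Hypothetical serum creatinine values (mg/dL), severe vs. non-severe cases
severe = [1.9, 2.3, 1.7, 2.8, 2.1]
non_severe = [0.8, 1.0, 0.9, 1.1, 0.7]
print(mann_whitney_u(severe, non_severe))  # 0.0: the groups separate completely
```

In practice a library routine (e.g., SPSS as in the study, or SciPy) would also supply the p-value.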

Keywords: COVID-19, biomarkers, mortality, Tigray, Ethiopia

Procedia PDF Downloads 40
1385 Fires in Historic Buildings: Assessment of Evacuation of People by Computational Simulation

Authors: Ivana R. Moser, Joao C. Souza

Abstract:

Building fires are random phenomena that can be extremely violent, and the safe evacuation of people is the most reliable means of saving lives. Correct evacuation of buildings, and of other spaces occupied by people, means leaving the place in a short time and by an appropriate route. It depends on the individual's perception of the spaces, the architectural layout, and the presence of appropriate routing systems. As historic buildings were constructed in other times, when current safety requirements generally did not yet exist, it is necessary to adapt these spaces to make them safe. Computational evacuation-simulation models are widely used tools for assessing the safety of people in a building or place of assembly, and their combination with the analysis of human behaviour makes the results of emergency evacuation studies more accurate and conclusive. The objective of this research is the performance evaluation of buildings of historical interest regarding the safe evacuation of people, through computer simulation using the PTV Viswalk software. The building studied is the Colégio Catarinense, a centennial building located in the city of Florianópolis, Santa Catarina, Brazil. The software models human-behaviour variables such as avoiding collisions with other pedestrians and avoiding obstacles. Scenarios were run on the three-dimensional models, and the contribution to safety in risk situations was verified as an alternative measure, especially where the measures prescribed by the current fire safety codes in Brazil cannot be applied. The simulations verified the evacuation time in normal and emergency situations, and indicated the bottlenecks and critical points of the buildings studied, in order to seek solutions that prevent and correct these undesirable events.
It is understood that adopting an advanced, computational, performance-based approach promotes greater knowledge of the building and of how people behave in these specific environments in emergency situations.
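As a rough cross-check on agent-based simulations of this kind, evacuation time is often hand-estimated with the capacity method. The sketch below is not the Viswalk model; the specific flow value and the occupant/door figures are illustrative assumptions, not data from the Colégio Catarinense study.

```python
def egress_time_estimate(occupants, exit_width_m,
                         specific_flow_pms=1.33, travel_time_s=0.0):
    """Capacity-method estimate of total evacuation time in seconds.

    specific_flow_pms: persons per metre of exit width per second
    (1.33 p/m/s is a commonly cited maximum flow through doorways).
    """
    flow = specific_flow_pms * exit_width_m   # persons passing per second
    queue_time = occupants / flow             # time for all occupants to pass
    return travel_time_s + queue_time

# Illustrative: 200 occupants, one 1.2 m door, 30 s walking time to the exit
print(round(egress_time_estimate(200, 1.2, travel_time_s=30.0), 1))
```

Simulation adds what this formula cannot capture: individual behaviour, congestion at intermediate points, and route choice.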

Keywords: computer simulation, escape routes, fire safety, historic buildings, human behavior

Procedia PDF Downloads 187
1384 Climate Change and Dengue Transmission in Lahore, Pakistan

Authors: Sadia Imran, Zenab Naseem

Abstract:

Dengue fever is one of the most alarming mosquito-borne viral diseases. The dengue virus has spread exponentially over the years throughout the tropical and sub-tropical regions of the world, particularly in the last ten years. Contributing factors include changing topography and climate change in the form of erratic seasonal trends, rainfall, early or late monsoons, and longer or shorter summers and winters. Globalization, frequent travel throughout the world, and viral evolution have led to more severe forms of dengue. The global incidence of dengue infections per year has ranged between 50 million and 200 million; however, recent estimates using cartographic approaches suggest this number is closer to 400 million. In recent years, Pakistan experienced a deadly outbreak of the disease, possibly because its population has high outdoor exposure. Public organizations have observed that the changing climate, especially lower average summer temperatures, and increased vegetation have created tropical-like conditions in the city, which are suitable for dengue virus growth. We conducted a time-series analysis to study the interrelationship between dengue incidence and diurnal ranges of temperature and humidity in Pakistan, with Lahore as the main focus of our study, using annual data from 2005 to 2015. We investigated the relationship between climatic variables and dengue incidence and used time-series analysis to describe temporal trends. The results show rising trends in dengue over the past 10 years, along with rises in temperature and rainfall in Lahore. This supports the widely held view that the world is suffering from climate change and global warming at different levels. Disease outbreaks are among the most alarming indications of mankind heading towards destruction, and we need to think of mitigating measures to keep epidemics from spreading and enveloping cities, countries, and regions.
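The climate-incidence association examined above can be sketched with a simple correlation between annual series. The numbers below are made-up stand-ins; the actual Lahore data are not reproduced in the abstract.

```python
def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx ** 0.5 * vy ** 0.5)

# Hypothetical annual mean temperature (°C) and dengue cases, 2005-2015
temperature = [23.1, 23.3, 23.2, 23.6, 23.8, 24.0,
               24.3, 24.2, 24.5, 24.7, 24.9]
dengue_cases = [120, 150, 140, 300, 450, 900,
                2100, 1200, 1600, 2100, 2600]
r = pearson_r(temperature, dengue_cases)
print(round(r, 2))  # a strongly positive r: cases rise with temperature
```

A full time-series analysis would also account for autocorrelation and lag effects, which a plain correlation ignores.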

Keywords: Dengue, epidemic, globalization, climate change

Procedia PDF Downloads 233
1383 Identification of Blood Biomarkers Unveiling Early Alzheimer's Disease Diagnosis Through Single-Cell RNA Sequencing Data and Autoencoders

Authors: Hediyeh Talebi, Shokoofeh Ghiam, Changiz Eslahchi

Abstract:

Traditionally, Alzheimer’s disease research has focused on genes with significant fold changes, potentially neglecting subtle but biologically important alterations. Our study introduces an integrative approach that highlights genes crucial to the underlying biological processes, regardless of their fold-change magnitude. Alzheimer's single-cell RNA-seq data from peripheral blood mononuclear cells (PBMCs) were extracted from the Gene Expression Omnibus (GEO). After quality control, normalization, scaling, batch-effect correction, and clustering, differentially expressed genes (DEGs) were identified with adjusted p-values less than 0.05. These DEGs were categorized by cell type, resulting in four datasets, each corresponding to a distinct cell type. To distinguish between cells from healthy individuals and those with Alzheimer's, an adversarial autoencoder with a classifier was employed, allowing the separation of healthy and diseased samples. To identify the most influential genes in this classification, the weight matrices of the network's encoder and classifier components were multiplied, and the analysis focused on the top 20 genes. The analysis revealed that while some of these genes exhibit a high fold change, others do not. These genes, which may be overlooked by previous methods due to their low fold change, were shown to be significant in our study. The findings highlight the critical role of genes with subtle alterations in diagnosing Alzheimer's disease, a facet frequently overlooked by conventional methods. These genes demonstrate remarkable discriminatory power, underscoring the need to integrate biological relevance with statistical measures in gene prioritization. This integrative approach enhances our understanding of the molecular mechanisms of Alzheimer’s disease and provides a promising direction for identifying potential therapeutic targets.
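The gene-ranking step described above, multiplying encoder and classifier weight matrices and keeping the top genes, can be sketched as follows. The tiny matrices are made up for illustration and stand in for trained network weights; the real model also contains nonlinearities that this linear product ignores.

```python
import numpy as np

def rank_genes_by_weight_product(w_encoder, w_classifier, top_k=3):
    """Rank input genes by their aggregate end-to-end weight to the classifier.

    w_encoder:    (n_genes, latent_dim) weights from genes to latent units
    w_classifier: (latent_dim, n_classes) weights from latent units to classes
    Returns gene indices sorted by descending absolute influence.
    """
    end_to_end = np.abs(w_encoder @ w_classifier)  # (n_genes, n_classes)
    importance = end_to_end.sum(axis=1)            # aggregate over classes
    return np.argsort(importance)[::-1][:top_k]

# Toy stand-in weights: 5 genes, 3 latent units, 2 classes (healthy vs. AD)
w_enc = np.array([[1.0, 0.0, 0.0],
                  [0.0, 1.0, 0.0],
                  [0.0, 0.0, 5.0],
                  [0.1, 0.1, 0.1],
                  [0.0, 0.0, 0.0]])
w_cls = np.ones((3, 2))
print(rank_genes_by_weight_product(w_enc, w_cls))  # gene 2 ranks first
```

In the study the same product is taken over the trained encoder and classifier, with top_k=20.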

Keywords: alzheimer's disease, single-cell RNA-seq, neural networks, blood biomarkers

Procedia PDF Downloads 66
1382 New Test Algorithm to Detect Acute and Chronic HIV Infection Using a 4th Generation Combo Test

Authors: Barun K. De

Abstract:

Acquired immunodeficiency syndrome (AIDS) is caused by two types of human immunodeficiency viruses, collectively designated HIV. HIV infection is spreading globally, particularly in developing countries. Before an individual is diagnosed with HIV, the disease goes through different phases: first an acute early phase, followed by an established or chronic phase, and subsequently a latency period, after which the individual becomes immunodeficient. It is in the acute phase that an individual is most infectious, due to a high viral load. Presently, HIV diagnosis often involves tests that do not detect acute-phase infection, during which both viral RNA and the p24 antigen are expressed. Instead, these less sensitive tests detect antibodies to viral antigens, which typically seroconvert later in the disease process, following acute infection. These antibodies are detected both in asymptomatic HIV-infected individuals and in AIDS patients. Studies indicate that early diagnosis and treatment of HIV infection can reduce medical costs, improve survival, and reduce the spread of infection to uninfected partners. Newer 4th-generation combination antigen/antibody tests are highly sensitive and specific for the detection of acute and established HIV infection (HIV-1 and HIV-2), enabling immediate linkage to care. The CDC (Centers for Disease Control and Prevention, USA) recently recommended an algorithm involving three different tests to screen for and diagnose acute and established HIV-1 and HIV-2 infections in a general population. Initially, a 4th-generation combo test detects the viral p24 antigen and specific antibodies against the HIV-1 and HIV-2 envelope proteins. If this test is positive, it is followed by a second test, known as a differentiation assay, which detects antibodies against specific HIV-1 and HIV-2 envelope proteins, confirming an established HIV-1 or HIV-2 infection.
However, if the differentiation assay is negative, a third test that measures viral load is performed, confirming an acute HIV-1 infection. Screening of a Phoenix-area population detected 0.3% new HIV infections, among which 32.4% were acute cases. Studies in the U.S. indicate that this algorithm effectively reduces HIV infection through immediate treatment and education following diagnosis.
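The three-step algorithm described above can be sketched as a simple decision function. The result labels are paraphrased from the abstract; real laboratory reporting involves additional result categories (e.g., indeterminate results and repeat testing) that this sketch omits.

```python
def cdc_hiv_algorithm(combo_reactive, differentiation_result=None,
                      rna_detected=None):
    """Classify a specimen under the 4th-generation three-test algorithm.

    combo_reactive:         antigen/antibody combo screen result (bool)
    differentiation_result: "HIV-1", "HIV-2", or None (negative)
    rna_detected:           HIV-1 nucleic acid (viral load) test result (bool)
    """
    if not combo_reactive:
        return "negative for HIV-1 and HIV-2"
    if differentiation_result in ("HIV-1", "HIV-2"):
        return f"established {differentiation_result} infection"
    # Combo reactive but antibody differentiation negative: resolve with NAT
    if rna_detected:
        return "acute HIV-1 infection"
    return "negative (likely false-positive screen)"

print(cdc_hiv_algorithm(True, differentiation_result=None, rna_detected=True))
# acute HIV-1 infection
```

The branch ordering mirrors the testing sequence: combo screen, then differentiation assay, then viral load only when the first two disagree.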

Keywords: new algorithm, HIV, diagnosis, infection

Procedia PDF Downloads 410
1381 Investigation of the Effects of 10-Week Nordic Hamstring Exercise Training and Subsequent Detraining on Plasma Viscosity and Oxidative Stress Levels in Healthy Young Men

Authors: H. C. Ozdamar , O. Kilic-Erkek, H. E. Akkaya, E. Kilic-Toprak, M. Bor-Kucukatay

Abstract:

The Nordic hamstring exercise (NHE) is used to increase hamstring muscle strength and prevent injuries. The aim of this study was to reveal the acute and long-term effects of 10 weeks of NHE, followed by 5 and 10 weeks of detraining, on anthropometric measurements, flexibility, anaerobic power, muscle architecture, muscle damage, fatigue, oxidative stress, plasma viscosity (PV), and blood lactate levels. Forty sedentary, healthy male volunteers underwent 10 weeks of progressive NHE followed by 5 and 10 weeks of detraining. Muscle architecture was determined by ultrasonography and stiffness by strain elastography. Anaerobic power was assessed by double-foot standing long jump and vertical jump tests, and flexibility by sit-and-reach and hamstring flexibility tests. Creatine kinase activity and oxidant/antioxidant parameters were measured from venous blood using commercial kits, whereas PV was determined using a cone-plate viscometer. Blood lactate was measured from the fingertip. NHE allowed subjects to lose weight; this effect was reversed by 5 weeks of detraining. Exercise caused an increase in knee angles measured by goniometer, which was not affected by detraining. Ten weeks of NHE caused an increase in anaerobic performance that was partially reversed upon detraining. NHE resulted in increases in the biceps femoris long head area and pennation angle, which were reversed by 10 weeks of detraining. Blood lactate levels, muscle pain, and fatigue increased after each exercise session. NHE did not change oxidant/antioxidant parameters; 5 weeks of detraining resulted in increases in total oxidant capacity (TOC) and the oxidative stress index (OSI), while 10 weeks of detraining reduced these parameters. Acute exercise caused a reduction in PV in weeks 1 through 10, and the pre-exercise PV measured in the 10th week was lower than the basal value; detraining increased PV. These results may guide the selection of exercise type to increase performance and muscle strength.
Knowing how much of the gains will be lost after a period of detraining can help raise awareness of the importance of exercise continuity. This work was supported by the PAU Scientific Research Projects Coordination Unit (project number 2018SABE034).

Keywords: anaerobic power, detraining, Nordic hamstring exercise, oxidative stress, plasma viscosity

Procedia PDF Downloads 126
1380 The Curse of Oil: Unpacking the Challenges to Food Security in the Nigeria's Niger Delta

Authors: Abosede Omowumi Babatunde

Abstract:

While the Niger Delta region satisfies the global thirst for oil, its inhabitants have not been adequately compensated for the use of their ancestral land. Moreover, the ruthless exploitation and destruction, through the activities of oil multinationals, of the natural environment upon which the inhabitants of the Niger Delta depend for their livelihood and sustenance pose major threats to food security in the region and, by implication, in Nigeria in general, Africa, and the world, given the present global emphasis on food security. This paper examines the effect of oil exploitation on household food security, identifies key gaps in the measures put in place to address the changes to livelihoods and food security, and explores what should be done to improve local people's access to sufficient, safe, and culturally acceptable food in the Niger Delta. Data are derived from interviews with key informants and Focus Group Discussions (FGDs) conducted with respondents in local communities in the Niger Delta states of Delta, Bayelsa, and Rivers, as well as from relevant extant studies. The threat to food security is one important aspect of the human security challenges in the Niger Delta that has received limited scholarly attention. In addition, successive Nigerian governments have not meaningfully addressed the negative impacts of oil-induced environmental degradation on traditional livelihoods, despite the significant linkages between environmental sustainability, livelihood security, and food security. The destructive impact of oil pollution on farmlands, crops, economic trees, creeks, lakes, and fishing equipment is so devastating that the people can no longer engage in productive farming and fishing. Also important is the limited access to modern agricultural methods, as fishing and subsistence farming are done using mostly crude implements and traditional methods.
It is imperative and urgent to take stock of the negative implications of the activities of oil multinationals for environmental and livelihood sustainability and for household food security in the Niger Delta.

Keywords: challenges, food security, Nigeria's Niger delta, oil

Procedia PDF Downloads 249
1379 Real-Time Hybrid Simulation for a Tuned Liquid Column Damper Implementation

Authors: Carlos Riascos, Peter Thomson

Abstract:

Real-time hybrid simulation (RTHS) is a modern cyber-physical technique for the experimental evaluation of complex systems that treats the system components with predictable behavior as a numerical substructure and the components that are difficult to model as an experimental substructure. It is therefore an attractive method for evaluating the response of civil structures under earthquake, wind, and anthropic loads. Another practical application of RTHS is the evaluation of control systems, as these devices are often nonlinear, and their characterization is an important step in the design of controllers with the desired performance. In this paper, the response of a three-story shear frame controlled by a tuned liquid column damper (TLCD) and subject to base excitation is considered. Both passive and semi-active control strategies were implemented and compared. While the passive TLCD achieved a 50% reduction in the acceleration response of the main structure in comparison with the uncontrolled structure, the semi-active TLCD achieved a 70% reduction and was robust to variations in the dynamic properties of the main structure. In addition, an RTHS was implemented with the main structure modeled as a linear time-invariant (LTI) system through a state-space representation, and the TLCD, with both control strategies, was evaluated on a shake table that reproduced the displacement of the virtual structure. Current assessment measures for RTHS were used to quantify performance, with parameters such as generalized amplitude, the equivalent time delay between the target and measured displacement of the shake table, and the energy error computed using the measured force; these show that the RTHS described in this paper is an accurate method for the experimental evaluation of structural control systems.
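The numerical-substructure side described above can be sketched by stepping an LTI state-space model in discrete time. The single-storey parameters below are illustrative, not those of the three-storey frame in the study, and a real RTHS loop would also feed the measured substructure force back into the model at each step.

```python
import numpy as np

def simulate_lti(m, c, k, x0, dt, steps):
    """Forward-Euler simulation of x_dot = A x for a damped oscillator.

    State x = [displacement, velocity]; A is the continuous-time
    system matrix of a single mass-spring-damper storey.
    """
    a_matrix = np.array([[0.0, 1.0],
                         [-k / m, -c / m]])
    x = np.array(x0, dtype=float)
    history = [x.copy()]
    for _ in range(steps):
        x = x + dt * (a_matrix @ x)   # explicit Euler step
        history.append(x.copy())
    return np.array(history)

# Illustrative substructure: 1 kg, 2 N·s/m damping, 100 N/m stiffness,
# released from 1 m initial displacement and simulated for 5 s
traj = simulate_lti(m=1.0, c=2.0, k=100.0, x0=[1.0, 0.0], dt=0.001, steps=5000)
print(abs(traj[-1, 0]) < 0.1)  # True: damping has largely dissipated the motion
```

Production RTHS codes use exact discretization or higher-order integrators; explicit Euler is shown only for brevity.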

Keywords: structural control, hybrid simulation, tuned liquid column damper, semi-active control strategy

Procedia PDF Downloads 297
1378 Using of the Fractal Dimensions for the Analysis of Hyperkinetic Movements in the Parkinson's Disease

Authors: Sadegh Marzban, Mohamad Sobhan Sheikh Andalibi, Farnaz Ghassemi, Farzad Towhidkhah

Abstract:

Parkinson's disease (PD), which is characterized by tremor at rest, rigidity, akinesia or bradykinesia, and postural instability, affects the quality of life of the individuals involved. The concept of a fractal is most often associated with irregular geometric objects that display self-similarity. The fractal dimension (FD) can be used to quantify the complexity and self-similarity of a signal such as tremor. In this work, we aim to propose a new method for evaluating hyperkinetic movements such as tremor, using the FD and other correlated parameters, in patients suffering from PD. In this study, we used the tremor data from PhysioNet. The database consists of fourteen participants diagnosed with PD, including six patients with high-amplitude tremor and eight patients with low-amplitude tremor. We tried to extract features from the data that can distinguish between patients before and after medication. We selected fractal dimensions, including the correlation dimension, box dimension, and information dimension. The Lilliefors test was used to test normality. Paired t-tests or Wilcoxon signed-rank tests were used to find differences between patients before and after medication, depending on whether normality was detected. In addition, two-way ANOVA was used to investigate the possible association between the therapeutic effects and the features extracted from the tremor. Just one of the extracted features showed significant differences between patients before and after medication. According to the results, the correlation dimension was significantly different before and after medication (p=0.009). Two-way ANOVA also demonstrated significant differences only in the medication effect (p=0.033); no significant differences were found between subjects (p=0.34) or in the interaction (p=0.97). The most striking result to emerge from the data is that the correlation dimension could quantify medication treatment based on tremor.
This study has provided a technique to evaluate a non-linear measure for quantifying medication, namely the correlation dimension. Furthermore, it supports the idea that fractal dimension analysis yields additional information compared with conventional spectral measures in the detection of poor-prognosis patients.
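The correlation-dimension estimate used above can be sketched in the Grassberger-Procaccia style: count point pairs closer than a radius r and read the dimension off the slope of log C(r) versus log r. The two-radius slope and the evenly spaced 1-D test set below are simplifications for illustration; real tremor analysis works on a delay-embedded signal and fits over a range of radii.

```python
import math

def correlation_sum(points, radius):
    """Fraction of distinct point pairs closer than `radius` (1-D signal)."""
    n = len(points)
    close = sum(1 for i in range(n) for j in range(i + 1, n)
                if abs(points[i] - points[j]) < radius)
    return 2.0 * close / (n * (n - 1))

def correlation_dimension(points, r1, r2):
    """Two-radius slope estimate of log C(r) versus log r."""
    c1 = correlation_sum(points, r1)
    c2 = correlation_sum(points, r2)
    return math.log(c2 / c1) / math.log(r2 / r1)

# Points filling a line segment should give a dimension close to 1
line = [i / 499.0 for i in range(500)]
d = correlation_dimension(line, r1=0.05, r2=0.1)
print(round(d, 2))  # close to 1.0
```

A genuinely fractal signal would yield a non-integer slope, which is what distinguishes tremor complexity before and after medication.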

Keywords: correlation dimension, non-linear measure, Parkinson’s disease, tremor

Procedia PDF Downloads 244
1377 Description of the Process Which Determine the Criterion Validity of Semi-Structured Interview PARA-SCI.CZ

Authors: Jarmila Štěpánová, Martin Kudláček, Lukáš Jakubec

Abstract:

People with spinal cord injury are among the least sport-active members of our society. Their hypoactivity is determined by the primary injury, i.e., the loss of motor function; the injury is connected with health complications and social handicap. This study performs one part of the standardization process of the semi-structured interview PARA-SCI.CZ (the Czech version of the Physical Activity Recall Assessment for People with Spinal Cord Injury), which measures the type, frequency, duration, and intensity of physical activity of people with spinal cord injury. The study focused on persons with paraplegia who use a wheelchair as their primary mode of mobility. The aim of this study was to perform the process of determining the criterion validity of PARA-SCI.CZ. The actual physical activity of wheelchair users was monitored over three days using Actigraph GT3X accelerometers fixed on the non-dominant wrist, together with the semi-structured interview PARA-SCI.CZ. During the PARA-SCI.CZ interview, participants were asked to recall the activities they had done over the past three days, starting with the previous day. PARA-SCI.CZ captured the frequency, duration, and intensity (low, moderate, and heavy) of two categories of physical activity (leisure-time physical activity and activities of a usual day). The Actigraph GT3X accelerometer captured the duration and intensity (low and moderate + heavy) of physical activity during the three days and nights. The study presents three potential recalculations of the measured data. The standardization process of PARA-SCI.CZ is essential for a critical approach to issues of health and active lifestyle among persons with spinal cord injury in the Czech Republic. The standardized PARA-SCI.CZ can be used in practice by physiotherapists and sports pedagogues in the field of adapted physical activities.

Keywords: physical activity, lifestyle, paraplegia, semi-structured interview, accelerometer

Procedia PDF Downloads 325
1376 Community Arts-Based Learning for Interdisciplinary Pedagogy: Measuring Program Effectiveness Using Design Imperatives for 'a New American University'

Authors: Kevin R. Wilson, Roger Mantie

Abstract:

Community arts-based learning and participatory education are pedagogical techniques that are advantageous for students, curriculum development, and local communities. Using an interpretive approach to examine the significance of this arts-informed research in relation to the eight ‘design imperatives’ proposed as the new model for measuring quality in scholarship for Arizona State University as ‘A New American University’, the purpose of this study was to investigate the personal, social, and cultural benefits resulting from student engagement in interdisciplinary community-based projects. Students from a graduate-level music education class at the ASU Tempe campus (n=7) teamed with students from an undergraduate-level community development class at the ASU Downtown Phoenix campus (n=14) to plan, facilitate, and evaluate seven community-based projects in several locations around the Phoenix metro area. Data were collected using photo evidence, student reports, and evaluative measures designed by the students. The effectiveness of each project was measured in terms of its ability to meet the eight design imperatives: 1) leverage place; 2) transform society; 3) value entrepreneurship; 4) conduct use-inspired research; 5) enable student success; 6) fuse intellectual disciplines; 7) be socially embedded; and 8) engage globally. Results indicated that this community arts-based project sufficiently captured the essence of each of these eight imperatives. Implications for how the nature of this interdisciplinary initiative allowed the eight imperatives to manifest are provided, and project success is expounded upon in relation to the utility of each imperative. Discussion is also given to how this type of service-learning project, formatted within the ‘New American University’ model for measuring quality in academia, can be a beneficial pedagogical tool in higher education.

Keywords: community arts-based learning, participatory education, pedagogy, service learning

Procedia PDF Downloads 401
1375 Reviewing Special Education Preservice Teachers' Reflective Practices over Two Field Experiences: Topics and Changes in Reflection

Authors: Laurie U. deBettencourt

Abstract:

During pre-service field experiences, teacher candidates are often asked to reflect as part of their training. In this investigation, candidates’ reflective journal entries were reviewed, coded, and analyzed, with results suggesting teacher candidates need more direct instruction on how to describe, analyze, and make judgements about their instructional practices so that those practices improve over time. Teacher education programs often incorporate reflection-based activities during field experiences. The purpose of this investigation was to determine whether special education teacher candidates’ reflective practices changed as they completed their two supervised field experiences and to determine what topics the candidates focused on in their reflections. The six female graduate students were completing two field experiences in special education classrooms within one academic year as part of the coursework leading to a master’s degree and special education teacher state certification. Each candidate wrote 15 reflective journal entries (approximately 200 words each) per field experience. Each journal entry was reviewed sentence by sentence to determine a reflective practice score and to identify the topics discussed. The reflective practice score was calculated using four dimensions of reflection (describe, analyze, judge, and apply) to create a continuous variable representing reflective practice across four points in time. A one-way repeated measures analysis of variance (ANOVA) suggested that the candidates did not change their reflective practices over time (i.e., the mean score was 56.0 out of 100 (SD = 7.6) at time-point one, 53.8 (SD = 4.3) at time-point two, 51.2 (SD = 4.5) at time-point three, and 57.7 (SD = 8.2) at time-point four). Qualitative findings suggest candidates focused mostly on themselves in their reflections.
Conclusions suggest the need for teacher preparation programs to provide more direct instruction on how a teacher should reflect. Specific implications are provided for teacher training and future research.

Keywords: field experiences, reflective practices, special educators, teacher preparation

Procedia PDF Downloads 350
1374 Monitoring the Thin Film Formation of Carrageenan and PNIPAm Microgels

Authors: Selim Kara, Ertan Arda, Fahrettin Dolastir, Önder Pekcan

Abstract:

Biomaterials and thin-film coatings play a fundamental role in the medical, food, and pharmaceutical industries. Carrageenan is a linear sulfated polysaccharide extracted from algae and seaweeds. To date, such biomaterials have been used in many smart drug delivery systems due to their biocompatibility and antimicrobial activity. Poly(N-isopropylacrylamide) (PNIPAm) gels and copolymers have also been used in medical applications. PNIPAm exhibits a lower critical solution temperature (LCST) at about 32-34 °C, which is very close to human body temperature. Below and above the LCST, PNIPAm gels exhibit distinct phase transitions between swollen and collapsed states. A special class of gels are microgels, which can react to environmental changes significantly faster than macroscopic gels due to their small size. The quartz crystal microbalance (QCM) measurement technique is an attractive method for monitoring the thin-film formation process. A sensitive QCM system was designed to detect a 0.1 Hz difference in resonance frequency and a 10⁻⁷ change in energy dissipation, which are measures of the deposited mass and the film rigidity, respectively. PNIPAm microgels with diameters of around a few hundred nanometers in water were produced via a precipitation polymerization process. Quartz crystals (5 MHz) with functionalized gold surfaces were used for the deposition of the carrageenan molecules and microgels from solutions slowly pumped through a flow cell. Interactions between charged carrageenan and microgel particles were monitored during the formation of the film layers, and the Sauerbrey masses of the deposited films were calculated. The critical phase transition temperatures around the LCST were detected during the heating and cooling cycles.
It was shown that it is possible to monitor the interactions between PNIPAm microgels and biopolymer molecules, and also to specify the critical phase transition temperatures, using a QCM system.
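The Sauerbrey masses referred to above follow from the Sauerbrey relation, which converts a QCM frequency shift into an areal mass. The sketch below assumes the widely cited sensitivity constant for a 5 MHz AT-cut crystal and is valid only for thin, rigid, evenly distributed films.

```python
def sauerbrey_mass(delta_f_hz, sensitivity_ng_cm2_hz=17.7):
    """Areal mass uptake (ng/cm^2) from a QCM resonance frequency shift.

    A frequency *decrease* (negative delta_f) indicates deposited mass;
    17.7 ng/(cm^2*Hz) is the commonly used constant for 5 MHz AT-cut quartz.
    """
    return -sensitivity_ng_cm2_hz * delta_f_hz

# A -10 Hz shift during film deposition would correspond to:
print(sauerbrey_mass(-10.0))  # 177.0 ng/cm^2
```

For soft, dissipative layers such as swollen microgels the Sauerbrey relation underestimates the mass, which is why dissipation is monitored alongside frequency.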

Keywords: carrageenan, phase transitions, PNIPAm microgels, quartz crystal microbalance (QCM)

Procedia PDF Downloads 231
1373 Contribution of PALB2 and BLM Mutations to Familial Breast Cancer Risk in BRCA1/2 Negative South African Breast Cancer Patients Detected Using High-Resolution Melting Analysis

Authors: N. C. van der Merwe, J. Oosthuizen, M. F. Makhetha, J. Adams, B. K. Dajee, S-R. Schneider

Abstract:

Women from high-risk breast cancer families who test negative for pathogenic mutations in BRCA1 and BRCA2 are four times more likely to develop breast cancer than women in the general population. Sequencing of genes involved in genomic stability and DNA repair has led to the identification of novel contributors to familial breast cancer risk, including BLM and PALB2. Bloom's syndrome is a rare autosomal recessive chromosomal instability disorder with a high incidence of various types of neoplasia; heterozygous BLM mutations are associated with breast cancer. PALB2 binds to BRCA2, and together they participate actively in DNA damage repair. Archived DNA samples of 66 BRCA1/2-negative high-risk breast cancer patients were retrospectively selected based on the presence of an extensive family history of the disease (>3 affected individuals per family). All coding regions and splice-site boundaries of both genes were screened using High-Resolution Melting Analysis. Samples exhibiting variation were sequenced bidirectionally by automated Sanger sequencing. The clinical significance of each variant was assessed using various in silico and splice-site prediction algorithms. Comprehensive screening identified a total of 11 BLM and 26 PALB2 variants, ranging from globally occurring to rare and including three novel mutations. Three BLM and two PALB2 likely pathogenic mutations were identified that could account for the disease in these extensive breast cancer families in the absence of BRCA mutations (BLM c.11T>A, p.V4D; BLM c.2603C>T, p.P868L; BLM c.3961G>A, p.V1321I; PALB2 c.421C>T, p.Gln141Ter; PALB2 c.508A>T, p.Arg170Ter). Conclusion: The study confirmed the contribution of pathogenic mutations in BLM and PALB2 to the familial breast cancer burden in South Africa, explaining the presence of the disease in 7.5% of the BRCA1/2-negative families with an extensive family history of breast cancer.
Segregation analysis will be performed to confirm the clinical impact of these mutations for each of the families concerned. These results justify the inclusion of both genes in a comprehensive breast and ovarian cancer next-generation sequencing panel, screened simultaneously with BRCA1 and BRCA2, as they might explain a significant percentage of familial breast and ovarian cancer in South Africa.
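The likely pathogenic variants above are written in standard HGVS coding-DNA notation (e.g. c.11T>A, meaning a T-to-A substitution at coding position 11). A minimal, purely illustrative sketch of parsing such substitution strings (this parser is not part of the study's pipeline):

```python
import re

# Coding-DNA substitution notation such as "c.11T>A" or "c.2603C > T";
# optional spaces around ">" are tolerated, as in the abstract's listing.
HGVS_SUB = re.compile(r"c\.(?P<pos>\d+)\s*(?P<ref>[ACGT])\s*>\s*(?P<alt>[ACGT])")

def parse_substitution(hgvs: str):
    """Return (coding position, reference base, alternate base), or None
    if the string is not a simple coding-DNA substitution."""
    m = HGVS_SUB.match(hgvs)
    if m is None:
        return None
    return int(m.group("pos")), m.group("ref"), m.group("alt")

# Variant strings taken from the abstract:
variants = ["c.11T>A", "c.2603C>T", "c.3961G>A", "c.421C>T", "c.508A>T"]
parsed = [parse_substitution(v) for v in variants]
```

Protein-level descriptions such as p.V4D use a different HGVS scheme and are not matched by this pattern.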

Keywords: Bloom Syndrome, familial breast cancer, PALB2, South Africa

Procedia PDF Downloads 236
1372 Maintenance Performance Measurement Derived Optimization: A Case Study

Authors: James M. Wakiru, Liliane Pintelon, Peter Muchiri, Stanley Mburu

Abstract:

Maintenance performance measurement (MPM) takes an integrated view that considers both operational and maintenance-related aspects while evaluating the effectiveness and efficiency of maintenance, ensuring that assets work as they should. Three salient issues must be addressed for an asset-intensive organization to employ an MPM-based framework to optimize maintenance. First, the organization should establish the key performance metric(s), in this case the maintenance objective(s), on which it will focus. The second issue entails aligning the maintenance objective(s) with maintenance optimization. This is achieved by deriving maintenance performance indicators that subsequently form an objective function for the optimization program. Lastly, the objective function is employed in an optimization program to derive maintenance decision support. In this study, we develop a framework that first identifies the crucial maintenance performance measures and then employs them to derive maintenance decision support. The proposed framework is demonstrated in a case study of a geothermal drilling rig, where the objective function is evaluated using a simulation-based model whose parameters are derived from empirical maintenance data. Availability, reliability, and maintenance inventory are identified as essential objectives requiring further attention. A simulation model is developed mimicking drilling rig operations and maintenance, in which the sub-systems are modelled as undergoing imperfect corrective (CM) and preventive (PM) maintenance, with the total cost as the primary performance measure. Moreover, three maintenance spare inventory policies are considered: classical (retaining stocks for a contractual period), vendor-managed inventory with consignment stock, and a periodic monitoring order-to-stock (s, S) policy.
Optimization results indicate that adopting the (s, S) inventory policy, increasing the PM interval, and reducing reliance on CM actions improve availability and reduce total cost.
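As a rough illustration of how an (s, S) spare-inventory policy and a total-cost objective can be evaluated by simulation: under an (s, S) policy, stock is reviewed periodically and replenished up to S whenever it falls to s or below. The toy model below is a sketch only; all demand rates and cost figures are invented and do not come from the case study.

```python
import random

def simulate_sS(s, S, periods=1000, seed=0, demand_p=0.3,
                holding_cost=1.0, order_cost=50.0, stockout_cost=200.0):
    """Toy periodic-review (s, S) spare-parts simulation.

    Each period, a CM/PM action may demand one spare (probability demand_p).
    A stockout delays maintenance and incurs a penalty; holding stock costs
    money each period; replenishing up to S incurs a fixed order cost.
    Returns the total cost over the horizon (the objective to minimize).
    """
    rng = random.Random(seed)
    stock, total = S, 0.0
    for _ in range(periods):
        if rng.random() < demand_p:     # a maintenance action needs a spare
            if stock > 0:
                stock -= 1
            else:
                total += stockout_cost  # maintenance delayed by missing spare
        total += stock * holding_cost   # holding cost on remaining stock
        if stock <= s:                  # periodic review: order up to S
            total += order_cost
            stock = S
    return total
```

In an optimization program, an outer loop would search over (s, S) pairs (and, in the study's setting, PM intervals) for the lowest simulated total cost.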

Keywords: maintenance, vendor-managed, decision support, performance, optimization

Procedia PDF Downloads 125
1371 Deep Learning for Qualitative and Quantitative Grain Quality Analysis Using Hyperspectral Imaging

Authors: Ole-Christian Galbo Engstrøm, Erik Schou Dreier, Birthe Møller Jespersen, Kim Steenstrup Pedersen

Abstract:

Grain quality analysis is a multi-parameter problem that includes a variety of qualitative and quantitative tasks such as grain type classification, damage type classification, and nutrient regression. Currently, assessing these parameters requires human inspection, a multitude of instruments employing a variety of sensor technologies and predictive model types, or destructive and slow chemical analysis. This paper investigates the feasibility of applying near-infrared hyperspectral imaging (NIR-HSI) to grain quality analysis. For this study, two datasets of NIR hyperspectral images in the wavelength range of 900 nm - 1700 nm have been used. Both datasets contain images of sparsely and densely packed grain kernels. The first dataset contains ~87,000 image crops of bulk wheat samples from 63 harvests, where the protein value has been determined by the FOSS Infratec NOVA, the industry gold standard for protein content estimation in bulk samples of cereal grain. The second dataset consists of ~28,000 image crops of bulk grain kernels from seven different wheat varieties and a single rye variety. The task for the first dataset is protein regression, while the task for the second is variety classification. Deep convolutional neural networks (CNNs) have the potential to exploit spatio-spectral correlations within a hyperspectral image to estimate qualitative and quantitative parameters simultaneously. CNNs can autonomously derive meaningful representations of the input data, reducing the need for the advanced preprocessing required by classical chemometric model types such as artificial neural networks (ANNs) and partial least-squares regression (PLS-R). A comparison between different CNN architectures utilizing 2D and 3D convolution is conducted, and the results are compared to the performance of ANNs and PLS-R. Additionally, a variety of preprocessing techniques from image analysis and chemometrics are tested.
These include centering, scaling, standard normal variate (SNV), Savitzky-Golay (SG) filtering, and detrending. The results indicate that the combination of NIR-HSI and CNNs has the potential to form the foundation of an automatic system unifying qualitative and quantitative grain quality analysis within a single sensor technology and predictive model type.
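Two of the chemometric preprocessing steps mentioned, SNV and detrending, can be sketched in a few lines of NumPy (an illustrative implementation under the usual definitions, not the authors' code):

```python
import numpy as np

def snv(spectra: np.ndarray) -> np.ndarray:
    """Standard normal variate: center and scale each spectrum (row)
    by its own mean and standard deviation."""
    mean = spectra.mean(axis=1, keepdims=True)
    std = spectra.std(axis=1, keepdims=True)
    return (spectra - mean) / std

def detrend(spectra: np.ndarray) -> np.ndarray:
    """Remove a per-spectrum linear baseline fitted along the wavelength axis."""
    x = np.arange(spectra.shape[1])
    coeffs = np.polyfit(x, spectra.T, deg=1)   # one (slope, intercept) per spectrum
    baseline = np.outer(coeffs[0], x) + coeffs[1][:, None]
    return spectra - baseline
```

Savitzky-Golay filtering additionally smooths each spectrum with a local polynomial fit (e.g. via `scipy.signal.savgol_filter` along the wavelength axis); deep CNNs are expected to reduce the need for these hand-crafted steps.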

Keywords: deep learning, grain analysis, hyperspectral imaging, preprocessing techniques

Procedia PDF Downloads 99
1370 Cleaning of Scientific References in Large Patent Databases Using Rule-Based Scoring and Clustering

Authors: Emiel Caron

Abstract:

Patent databases contain patent-related data, organized in a relational data model, and are used to produce various patent statistics. These databases store raw data about the scientific references cited by patents. For example, Patstat holds references to tens of millions of scientific journal publications and conference proceedings. These references can be used to connect patent databases with bibliographic databases, e.g., to study the relation between science, technology, and innovation in various domains. Problematic in such studies is the low data quality of the references: they are often ambiguous, unstructured, and incomplete. Moreover, a complete bibliographic reference is stored in a single attribute. Therefore, a computerized cleaning and disambiguation method for large patent databases is developed in this work. The method uses rule-based scoring and clustering. The rules are based on bibliographic metadata, retrieved from the raw data by regular expressions, and are transparent and adaptable. The rules, in combination with string similarity measures, are used to detect pairs of records that are potential duplicates. Because of the scoring, different rules can be combined to join scientific references, i.e., the rules reinforce each other. The scores are based on expert knowledge and an initial evaluation of the method. After scoring, pairs of scientific references whose score is above a certain threshold are clustered by means of a single-linkage clustering algorithm to form connected components. The method is designed to disambiguate all the scientific references in the Patstat database. The performance evaluation of the clustering method, on a large gold-standard set of highly cited papers, shows on average 99% precision and 95% recall. The method is therefore accurate but careful, i.e., it favours precision over recall. Consequently, separate clusters of high precision are sometimes formed when there is not enough evidence for connecting scientific references, e.g.,
in the case of missing year and journal information for a reference. The clusters produced by the method can be used to directly link the Patstat database with bibliographic databases such as Web of Science or Scopus.
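The single-linkage step, grouping above-threshold reference pairs into connected components, can be sketched with a union-find structure (an illustrative implementation; the pair scores and threshold below are invented, and the paper's rule-based scoring is not reproduced here):

```python
def connected_components(n, scored_pairs, threshold):
    """Single-linkage clustering: references i and j whose combined rule
    score reaches the threshold end up in the same connected component.

    n            -- number of reference records (ids 0..n-1)
    scored_pairs -- iterable of (i, j, score) candidate duplicate pairs
    threshold    -- minimum score for linking a pair
    """
    parent = list(range(n))

    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]  # path halving
            i = parent[i]
        return i

    for i, j, score in scored_pairs:
        if score >= threshold:
            ri, rj = find(i), find(j)
            if ri != rj:
                parent[rj] = ri            # merge the two components

    clusters = {}
    for i in range(n):
        clusters.setdefault(find(i), []).append(i)
    return list(clusters.values())
```

Note the single-linkage behaviour: records 0 and 2 land in one cluster even if only the pairs (0, 1) and (1, 2) score above the threshold, which is why conservative (precision-weighted) scoring matters.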

Keywords: clustering, data cleaning, data disambiguation, data mining, patent analysis, scientometrics

Procedia PDF Downloads 194