Search results for: Michael San Francisco
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 827

197 Discussion of Blackness in Wrestling

Authors: Jason Michael Crozier

Abstract:

The wrestling territories of the mid-twentieth century in the United States are widely considered the birthplace of modern professional wrestling and, by many professional wrestlers, a beacon of hope for the easing of racial tensions during the civil rights era and beyond. The performers writing on this period speak of racial equality but fail to acknowledge the exploitation of black athletes as racialized capital commodities who suffered the challenges of systemic racism, codified by a false narrative of aspirational exceptionalism and equality measured by audience diversity. The promoters' ability to equate racial and capital exploitation with equality leads to a broader discussion of the history of Muscular Christianity in the United States and the exploitation of black bodies. Narratives of racial erasure that dominate the historical discourse on athleticism and exceptionalism redefined how blackness existed and how physicality and race are conceived of in sport and entertainment spaces. When discussing the implications of race in professional wrestling, it is important to examine the role of promotions as 'imagined communities' in which the social agency of wrestlers is defined and quantified based on their 'desired elements' as performers. The intentionally vague nature of this language masks a deep history of racialization that has been perpetuated by promoters and never fully examined by scholars. Sympathetic racism and the omission of cultural identity are also key factors in the limitations and racial barriers placed upon black athletes in the squared circle. The use of sympathetic racism within professional wrestling during the twentieth century sorted black athletes into two distinct categories: the 'black savage' and the 'black minstrel'. Black wrestlers of the twentieth century were defined by their strength as a capital commodity and their physicality rather than by their knowledge of the business and in-ring skill. These performers had little agency to shape their own character development inside and outside the ring. Promoters would often create personas that heavily racialized the performer by tying them to a regional past or memory, such as that of slavery in the deep South, using dog-collar matches and adorning black characters in chains. Promoters softened cultural memory by satirizing the historic legacy of slavery and the black identity.

Keywords: sympathetic racism, social agency, racial commodification, stereotyping

Procedia PDF Downloads 126
196 Re-Evaluating the Hegemony of English Language in West Africa: A Meta-Analysis Review of the Research, 2003-2018

Authors: Oris Tom-Lawyer, Michael Thomas

Abstract:

This paper analyses the hegemony of the English language in West Africa through the lens of educational policies and the socio-economic functions of the language. It is based on the premise that there is a positive link between the English language and development contexts. The study aims to fill a gap in the research literature by examining the usefulness of hegemony as a concept to explain the role of the English language in the region, thus countering the negative connotations that often accompany it. The study identified four main research questions: i. What are the socio-economic functions of English in francophone/lusophone countries? ii. What factors promote the hegemony of English in anglophone countries? iii. To what extent is English hegemonic in West Africa? iv. What are the implications of the non-hegemony of English in West Africa? Based on a meta-analysis of the research literature between 2003 and 2018, the findings revealed that in francophone/lusophone countries, English functions in the following socio-economic domains: peacekeeping missions, regional organisations, the commercial and industrial sectors, as an unofficial international language, and as a foreign language. The factors that promote the linguistic hegemony of English in anglophone countries are English as an official language, as a medium of instruction, as a lingua franca, as a cultural language, as the language of politics and commerce, as a channel of development, and English for media and entertainment. In addition, the extent of the hegemony of English in West Africa can be gauged from the factors that contribute to its non-hegemony in the region: the French language, the Portuguese language, French culture, neo-colonialism, the level of poverty, and the economic ties of France to its former colonies. Finally, the implications of the non-hegemony of the English language in West Africa are industrial backwardness, high poverty rates, a lack of social mobility, high school drop-out rates, growing interest in English, access to limited internet information, and a lack of extensive career opportunities. The paper concludes that the hegemony of English has contributed to the development of anglophone countries in West Africa, while in the francophone/lusophone regions, industrial backwardness and low literacy rates have been consequences of the marginalisation of English. The paper closes with several recommendations, including the early introduction of English into French-medium curricula as part of a potential solution.

Keywords: developmental tool, English language, linguistic hegemony, West Africa

Procedia PDF Downloads 136
195 Identifying the Determinants of Compliance with Maritime Environmental Legislation in the North and Baltic Sea Area: A Model Developed from Exploratory Qualitative Data Collection

Authors: Thea Freese, Michael Gille, Andrew Hursthouse, John Struthers

Abstract:

Ship operators on the North and Baltic Seas have been experiencing increased political interest in marine environmental protection and cleaner vessel operations. Stricter legislation on SO2 and NOx emissions, ballast water management and other measures of protection is currently being phased in or will come into force in the coming years. These measures benefit the health of the marine environment while increasing companies' operational costs. In times of excess shipping capacity and the linked consolidation in the industry, non-compliance with environmental rules is one way companies might hope to stay competitive in both intra- and inter-modal trade. Around 5-15% of industry participants are believed to neglect laws on vessel-source pollution, willingly or unwillingly. Exploratory in-depth interviews conducted with 12 experts from various stakeholder groups informed the researchers about variables influencing compliance levels, including awareness and apprehension, willingness to comply, ability to comply, and effectiveness of controls. The semi-structured expert interviews were evaluated using qualitative content analysis, and a model of the determinants of compliance was developed and is presented here. While most vessel operators endeavour to achieve full compliance with environmental rules, a lack of available technical solutions, of expediency of implementation and operation, and of economic feasibility might prove a hindrance. Ineffective control systems, on the other hand, foster willing non-compliance. With respect to motivations, a lack of time, a lack of financial resources and the absence of commercial advantages decrease compliance levels. These and other variables were inductively developed from the qualitative data and integrated into a model of environmental compliance. The outcomes presented here form part of a wider research project on the economic effects of maritime environmental legislation. Research on the determinants of compliance might inform policy-makers about the actual behaviour of shipping companies and might further the development of a comprehensive legal system for environmental protection.

Keywords: compliance, marine environmental protection, exploratory qualitative research study, clean vessel operations, North and Baltic Sea area

Procedia PDF Downloads 378
194 3D Dentofacial Surgery Full Planning Procedures

Authors: Oliveira M., Gonçalves L., Francisco I., Caramelo F., Vale F., Sanz D., Domingues M., Lopes M., Moreia D., Lopes T., Santos T., Cardoso H.

Abstract:

The ARTHUR project is a platform that allows maxillofacial surgeries to be performed virtually, offering, in a photorealistic way, the possibility for patients to get an idea of the surgical changes before they are performed on their face. For this, the system brings together several image formats, DICOMs and OBJs, which, after loading, generate the bone volume, soft tissues and hard tissues. The system also incorporates the patient's stereophotogrammetry, in addition to their data and clinical history. After loading and inserting the data, the clinician can virtually perform the surgical operation and present the final result to the patient, generating a new facial surface that reflects the changes made to the bone and tissues of the maxillary area. This tool applies to different situations requiring facial reconstruction; however, the project focuses specifically on two types of use case: congenital bone disfigurement and acquired disfigurement, such as oral cancer with bone involvement. Developed as a cloud-based solution with mobile support, the tool aims to shorten the patient's decision window. Because current simulations are either unrealistic or, if realistic, time-consuming due to the need to build plaster models, patients' decisions rely on a long time window (1-2 months), as they do not identify with the presented surgical outcome. Moreover, such planning has been based on average estimated values of the position of the maxilla and mandible, drawn from population averages of facial measurements without accounting for racial variability, so existing solutions are not adjusted to real, individual physiognomic needs.

Keywords: 3D computing, image processing, image registry, image reconstruction

Procedia PDF Downloads 200
193 Analysis of Pangasinan State University: Bayambang Students’ Concerns Through Social Media Analytics and Latent Dirichlet Allocation Topic Modelling Approach

Authors: Matthew John F. Sino Cruz, Sarah Jane M. Ferrer, Janice C. Francisco

Abstract:

The COVID-19 pandemic has affected more than 114 countries all over the world since it was declared a global health concern in 2020. Different sectors, including education, have shifted to remote/distance setups to follow the guidelines set to prevent the spread of the disease. One of the higher education institutions that shifted to a remote setup is the Pangasinan State University (PSU). In order to continue providing quality instruction to its students amidst the pandemic, PSU designed a Flexible Learning Model covering the redesign of instruction delivery in a remote setup and the technology needed to support these adjustments. The primary goal of this study is to determine the insights of the PSU - Bayambang students into the remote setup implemented during the pandemic and how they perceived the initiatives employed, in relation to their experiences of flexible learning. In this study, a topic modelling approach was implemented using Latent Dirichlet Allocation, applied to a dataset of the students' social media posts. The results show that the most common concerns of the students include time and resource management, poor internet connection, and difficulty coping with the flexible learning modality. The findings can serve as one of the bases for the administration to review and improve the policies and initiatives implemented during the pandemic in relation to remote service delivery. In addition, further studies can be conducted to determine the overall sentiment of the other stakeholders towards the policies implemented at the University.
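As a sketch of the approach described above, the following is a minimal collapsed-Gibbs implementation of Latent Dirichlet Allocation in pure Python, run on hypothetical toy "concern" documents; the actual PSU dataset, preprocessing, and model settings are not specified in the abstract, so everything here is illustrative only:

```python
import random

def lda_gibbs(docs, K=2, iters=200, alpha=0.1, beta=0.01, seed=0):
    """Collapsed Gibbs sampling for Latent Dirichlet Allocation."""
    rng = random.Random(seed)
    vocab = sorted({w for doc in docs for w in doc})
    V = len(vocab)
    wid = {w: i for i, w in enumerate(vocab)}
    n_dk = [[0] * K for _ in docs]        # document-topic counts
    n_kw = [[0] * V for _ in range(K)]    # topic-word counts
    n_k = [0] * K                         # topic totals
    z = []                                # topic assignment per token
    for d, doc in enumerate(docs):
        zs = []
        for w in doc:                     # random initial assignments
            k = rng.randrange(K)
            zs.append(k)
            n_dk[d][k] += 1; n_kw[k][wid[w]] += 1; n_k[k] += 1
        z.append(zs)
    for _ in range(iters):
        for d, doc in enumerate(docs):
            for i, w in enumerate(doc):
                k = z[d][i]               # remove the current assignment
                n_dk[d][k] -= 1; n_kw[k][wid[w]] -= 1; n_k[k] -= 1
                # full conditional: P(z=j) proportional to
                # (n_dk + alpha) * (n_kw + beta) / (n_k + V*beta)
                weights = [(n_dk[d][j] + alpha) * (n_kw[j][wid[w]] + beta)
                           / (n_k[j] + V * beta) for j in range(K)]
                k = rng.choices(range(K), weights=weights)[0]
                z[d][i] = k
                n_dk[d][k] += 1; n_kw[k][wid[w]] += 1; n_k[k] += 1
    top_words = [[vocab[i] for i in sorted(range(V), key=lambda i: -n_kw[k][i])[:3]]
                 for k in range(K)]
    return top_words, n_dk

# hypothetical tokenized student posts (stand-ins for the real dataset)
docs = [["internet", "connection", "slow", "signal"],
        ["signal", "internet", "slow", "data"],
        ["deadline", "time", "workload", "stress"],
        ["time", "stress", "deadline", "workload"]]
topics, doc_topics = lda_gibbs(docs)
```

On well-separated toy data like this, the two inferred topics typically split into connectivity-related and workload-related terms, mirroring the "poor internet connection" and "time management" concerns the study reports.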

Keywords: COVID-19, topic modelling, students’ sentiment, flexible learning, Latent Dirichlet allocation

Procedia PDF Downloads 118
192 Comprehensive Longitudinal Multi-omic Profiling in Weight Gain and Insulin Resistance

Authors: Christine Y. Yeh, Brian D. Piening, Sarah M. Totten, Kimberly Kukurba, Wenyu Zhou, Kevin P. F. Contrepois, Gucci J. Gu, Sharon Pitteri, Michael Snyder

Abstract:

Three million deaths worldwide are attributed to obesity, yet the biomolecular mechanisms that link adiposity to subsequent disease states are poorly understood. Insulin resistance characterizes approximately half of obese individuals and is a major cause of obesity-mediated diseases such as type II diabetes, hypertension and other cardiovascular diseases. This study applies longitudinal, quantitative, high-throughput multi-omics methodologies (genomics, epigenomics, transcriptomics, glycoproteomics, etc.) to blood samples to develop multigenic and multi-analyte signatures associated with weight gain and insulin resistance. Participants underwent a 30-day period of weight gain via excessive caloric intake followed by a 60-day period of restricted dieting and return to baseline weight. Blood samples were taken at three time points per patient: baseline, peak weight and post weight loss. Patients were characterized as either insulin-resistant (IR) or insulin-sensitive (IS) before their samples were processed via longitudinal multi-omic technologies. This comparative study revealed a wealth of biomolecular changes associated with weight gain, using methods from machine learning, clustering and network analysis. Pathways of interest included those involved in lipid remodeling, the acute inflammatory response and glucose metabolism. Some of these biomolecules returned to baseline levels as the patients returned to normal weight, while some remained elevated. IR patients exhibited key differences in inflammatory response regulation compared with IS patients at all time points. These signatures suggest differential metabolic and inflammatory pathways between IR and IS patients. Biomolecular differences associated with weight gain and insulin resistance were identified at various levels: gene expression, epigenetic change, transcriptional regulation and glycosylation. The study not only contributes new biology that could be of use in preventing or predicting obesity-mediated diseases, but also matures novel biomedical informatics technologies for producing and processing data at many comprehensive omics levels.

Keywords: insulin resistance, multi-omics, next generation sequencing, proteogenomics, type ii diabetes

Procedia PDF Downloads 425
191 Utilization of Standard Paediatric Observation Chart to Evaluate Infants under Six Months Presenting with Non-Specific Complaints

Authors: Michael Zhang, Nicholas Marriage, Valerie Astle, Marie-Louise Ratican, Jonathan Ash, Haddijatou Hughes

Abstract:

Objective: Young infants are often brought to the Emergency Department (ED) with a variety of complaints, some of which are non-specific and present a diagnostic challenge to the attending clinician. Whilst invasive investigations such as blood tests and lumbar puncture are necessary in some cases to exclude serious infections, basic clinical tools, in addition to a thorough clinical history, can be useful for assessing the risk of serious conditions in these young infants. This study aimed to examine the utilization of one such clinical tool. Methods: This retrospective observational study examined the medical records of infants under 6 months presenting to a mixed urban ED between January 2013 and December 2014. Infants deemed by the emergency clinicians to have non-specific complaints or diagnoses were selected for analysis; those with clear systemic diagnoses were excluded. Among all relevant clinical information and investigation results, utilization of the Standard Paediatric Observation Chart (SPOC) was particularly scrutinized. This chart, developed by expert clinicians in the local health department, categorizes important clinical signs into colour-coded zones as a visual cue to the serious implications of certain abnormalities. An infant is regarded as SPOC-positive when it fulfils one red-zone or two yellow-zone criteria, prompting the attending clinician to investigate and treat for potentially serious conditions accordingly. Results: Eight hundred and thirty-five infants met the inclusion criteria. Those admitted to the hospital for further management were more likely to be SPOC-positive than the discharged infants (odds ratio: 12.26, 95% CI: 8.04-18.69). Similarly, the sepsis alert criteria on the SPOC were positive in a higher percentage of patients with serious infections (56.52%) than in those with mild conditions (15.89%) (p < 0.001). The SPOC sepsis criteria had a sensitivity of 56.5% (95% CI: 47.0%-65.7%) and a moderate specificity of 84.1% (95% CI: 80.8%-87.0%) for identifying serious infections. Applied to this infant population, with a 17.4% prevalence of serious infection, the positive predictive value was only 42.8% (95% CI: 36.9%-49.0%), but the negative predictive value was high at 90.2% (95% CI: 88.1%-91.9%). Conclusions: The Standard Paediatric Observation Chart is a useful clinical tool for helping to identify and manage sick young infants in the ED effectively.
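The predictive values quoted above follow directly from sensitivity, specificity and prevalence via Bayes' theorem; a quick check in Python, using the point estimates reported in the abstract:

```python
def predictive_values(sens, spec, prev):
    """Derive PPV and NPV from sensitivity, specificity and prevalence (Bayes' theorem)."""
    ppv = sens * prev / (sens * prev + (1 - spec) * (1 - prev))
    npv = spec * (1 - prev) / (spec * (1 - prev) + (1 - sens) * prev)
    return ppv, npv

# SPOC sepsis criteria: sensitivity 56.5%, specificity 84.1%, prevalence 17.4%
ppv, npv = predictive_values(0.565, 0.841, 0.174)
print(round(ppv * 100, 1), round(npv * 100, 1))  # 42.8 90.2
```

This reproduces the reported 42.8% positive and 90.2% negative predictive values, illustrating how a moderately sensitive rule can still rule out disease well when prevalence is low.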

Keywords: clinical tool, infants, non-specific complaints, Standard Paediatric Observation Chart

Procedia PDF Downloads 244
190 Simons, Ehrlichs and the Case for Polycentricity – Why Growth-Enthusiasts and Growth-Sceptics Must Embrace Polycentricity

Authors: Justus Enninga

Abstract:

Enthusiasts and skeptics about economic growth have little in common in their preferred institutional arrangements for resolving ecological conflicts. This paper argues that agreement between the two opposing schools can be found in the Bloomington School's concept of polycentricity. Growth enthusiasts, referred to here as Simons after the economist Julian Simon, and growth skeptics, named Ehrlichs after the ecologist Paul R. Ehrlich, both profit from a governance structure in which many officials and decision structures are assigned limited and relatively autonomous prerogatives to determine, enforce and alter legal relationships. The paper advances this argument in four steps. First, it clarifies what Simons and Ehrlichs mean when they talk about growth and what the arguments for and against growth-enhancing or degrowth policies are for each side. Secondly, the paper advances the concept of polycentricity as first introduced by Michael Polanyi and later refined for the study of governance by the Bloomington School of institutional analysis around the Nobel Prize laureate Elinor Ostrom. The Bloomington School defines polycentricity as a non-hierarchical, institutional and cultural framework that makes possible the coexistence of multiple centers of decision making with different objectives and values, and that sets the stage for an evolutionary competition between the complementary ideas and methods of those different decision centers. In the third and fourth parts, it is shown how the concept of polycentricity is of crucial importance for growth enthusiasts and growth skeptics alike. The shorter third part reviews the literature on growth-enhancing policies and argues that large parts of it already accept that polycentric forms of governance like markets, the rule of law and federalism are an important part of economic growth. Part four delves into the more nuanced question of why a stagnant steady-state economy, or even an economy that de-grows, will still find polycentric governance desirable. While the majority of degrowth proposals follow a top-down approach requiring direct governmental control, a contrasting bottom-up approach is advanced here. A decentralized, polycentric approach is desirable because it allows for the utilization of tacit information dispersed in society and provides an institutionalized discovery process for new solutions to the problem of ecological collective action - no matter whether you belong to the Simons or the Ehrlichs in a green political economy.

Keywords: degrowth, green political theory, polycentricity, institutional robustness

Procedia PDF Downloads 180
189 Investigations of the Service Life of Different Material Configurations at Solid-lubricated Rolling Bearings

Authors: Bernd Sauer, Michel Werner, Stefan Emrich, Michael Kopnarski, Oliver Koch

Abstract:

Friction reduction is an important aspect of sustainability and the energy transition. Rolling bearings are used in many applications in which components move relative to each other. Conventionally lubricated rolling bearings serve a wide range of applications but are not suitable under certain conditions: conventional lubricants such as grease or oil cannot be used at very high or very low temperatures, and they evaporate at very low ambient pressure, e.g. in a high-vacuum environment, making the use of solid-lubricated bearings unavoidable. With solid-lubricated bearings, predicting the service life becomes more complex. While the end of the service life of conventionally lubricated bearings is mainly caused by failure of the bearing components due to material fatigue, solid-lubricated bearings fail at the moment the lubrication layer is worn through and the rolling elements come into direct contact with the raceway during operation. To extend the service life of these bearings beyond that of the initial coating, transfer lubrication is recommended: pockets or sacrificial cages in which the balls run absorb the lubricant, which is then available for lubrication in the tribological contact. This contribution presents the results of wear and service life tests on solid-lubricated rolling bearings with sacrificial cage pockets. The cage of the bearing consists of a polyimide (PI) matrix with 15% molybdenum disulfide (MoS2) and serves as a lubrication depot alongside the silver-coated balls. The bearings are tested under high vacuum (pE < 10⁻² Pa) at a temperature of 300 °C on a four-bearing test rig. First, investigations of the bearing system within the bearing service life are presented, and the torque curve, the wear mass and surface analyses are discussed. With regard to wear, the bearing rings tend to gain mass over the service life of the bearing, while the balls and the cage tend to lose mass. Concerning the elemental surface properties, the surfaces of the bearing rings and balls are examined for the mass of the elements present on them. Furthermore, service life investigations with different material pairings are presented, with the focus on the service life achieved, in addition to the torque curve, wear development and surface analysis. MoS2 in the cage was shown to lead to a longer service life, while a silver (Ag) coating on the balls has no positive influence on the service life and, in combination with MoS2, even appears to reduce it.

Keywords: ball bearings, molybdenum disulfide, solid lubricated bearings, solid lubrication mechanisms

Procedia PDF Downloads 42
188 Comprehensive Geriatric Assessments: An Audit into Assessing and Improving Uptake on Geriatric Wards at King’s College Hospital, London

Authors: Michael Adebayo, Saheed Lawal

Abstract:

The Comprehensive Geriatric Assessment (CGA) is the multidimensional tool used to assess elderly, frail patients either on admission to hospital care or at a community level in primary care. It is designed to manage patients using a holistic approach. A 2011 Cochrane review of CGA use found that the likelihood of patients being alive and living in their own home post-discharge rises by 30%. RCTs have also found 10-15% reductions in readmission rates, as well as reductions in institutionalization, resource use and costs. Past audit cycles at King's College Hospital, Denmark Hill had shown inconsistent evidence of CGA completion in patient discharge summaries (less than 50%). Junior doctors on the Health and Ageing (HAU) wards have struggled to sustain the efforts of past audit cycles due to the quick turnover of staff (four-month placements for trainees). This seventh cycle created a multi-faceted approach to solving this problem amongst staff and creating lasting change. Methods: 1. We adopted multidisciplinary team involvement to support doctors; MDT staff, e.g. nurses, physiotherapists, occupational therapists and dieticians, were actively encouraged to fill in the CGA document. 2. We added a CGA pro-forma to "Sunrise EPR" (the Trust computer system); these CGAs were to be automatically included in the discharge summary. 3. Prior to assessing uptake, we used a spot-audit questionnaire to assess staff awareness and knowledge of what a CGA is. 4. We designed and placed posters highlighting the domains of the CGA and the MDT roles suited to each domain on the geriatric Health and Ageing wards in the hospital. 5. We audited the percentage of discharge summaries that included CGA and MDT role input. 6. We nominated ward champions from each multidisciplinary specialty on each ward to monitor and encourage colleagues to actively complete CGAs. 7. We initiated further education of ward staff on the CGA's importance through discussion at board rounds and weekly multidisciplinary meetings. Outcomes: 1. The majority of respondents to our spot audit were aware of what a CGA was, but fewer had used the EPR document to complete one. 2. We found that CGAs were not being commenced for nearly 50% of patients discharged from the HAU wards and the Frailty Assessment Unit.

Keywords: comprehensive geriatric assessment, CGA, multidisciplinary team, quality of life, mortality

Procedia PDF Downloads 79
187 Plural Perspectives in Conservation Conflicts: The Role of Iconic Species

Authors: Jean Hugé, Francisco Benitez-Capistros, Giorgia Camperio-Ciani

Abstract:

Addressing conservation conflicts requires considering multiple stakeholders' perspectives and knowledge claims in order to inform complex and possibly contentious decision-making dilemmas. Hence, a better understanding is needed of why people in particular contexts act in a particular way in a conservation conflict. First, this contribution provides and applies an approach to map and interpret the diversity of subjective viewpoints with regard to iconic species in conservation conflicts. Secondly, it aims to inform reflection on the possible consequences of this diversity of perspectives for the future management of wildlife (in particular, iconic species), based on case studies in the Galapagos and Malaysia. The use of the semi-quantitative Q methodology allowed us to identify various perspectives on conservation in different social-ecological contexts. While the presence of iconic species may lead to a more passionate and emotional debate, it may also provide more opportunities for finding common ground and for jointly developing acceptable management solutions that depolarize emergent, long-lasting or latent conservation conflicts. Based on the research team's experience in the field, and on the integration of ecological and social knowledge, methodological and management recommendations are made with regard to conservation conflicts involving iconic wildlife. The mere presence of iconic wildlife does not guarantee its centrality in conservation conflicts, and comparisons are drawn between the cases of the giant tortoises (Chelonoidis spp.) in the Galapagos, Ecuador, and the milky stork (Mycteria cinerea) in western peninsular Malaysia. Acknowledging the diversity of viewpoints, which reflect how different stakeholders see, act and talk about wildlife management, highlights the need to develop pro-active and resilient strategies to deal with these issues.

Keywords: conservation conflicts, Q methodology, Galapagos, Malaysia, giant tortoise, milky stork

Procedia PDF Downloads 272
186 Quality Assurance for the Climate Data Store

Authors: Judith Klostermann, Miguel Segura, Wilma Jans, Dragana Bojovic, Isadora Christel Jimenez, Francisco Doblas-Reyes, Judit Snethlage

Abstract:

The Climate Data Store (CDS), developed by the Copernicus Climate Change Service (C3S), implemented by the European Centre for Medium-Range Weather Forecasts (ECMWF) on behalf of the European Union, is intended to become a key instrument for exploring climate data. The CDS contains both raw and processed data to inform users about the past, present and future climate of the Earth. It allows easy and free access to climate data and indicators, making it an important asset for scientists and stakeholders on the path to a more sustainable future. The C3S Evaluation and Quality Control (EQC) function assesses the quality of the CDS by undertaking a comprehensive user-requirement assessment to measure user satisfaction. Recommendations will be developed for the improvement and expansion of the CDS datasets and products. User requirements will be identified concerning the fitness of the datasets, the toolbox, and the overall CDS service. The EQC function will help C3S make the service more robust: underpinned by validated data that meets high-quality standards while remaining user-friendly. This function will be developed in close collaboration with the users of the service; through their feedback, suggestions and contributions, the CDS can become more accessible and meet the requirements of a diverse range of users. Stakeholders and their active engagement are thus an important aspect of CDS development. This will be achieved through direct interactions with users, such as meetings, interviews or workshops, as well as through feedback mechanisms such as surveys or helpdesk services at the CDS. The results provided by the users will be categorized by CDS product so that their specific interests can be monitored and linked to the right product. Through this procedure, we will identify the requirements and criteria for data and products in order to formulate the corresponding recommendations for the improvement and expansion of the CDS datasets and products.

Keywords: climate data store, Copernicus, quality, user engagement

Procedia PDF Downloads 143
185 Efficient Estimation of Maximum Theoretical Productivity from Batch Cultures via Dynamic Optimization of Flux Balance Models

Authors: Peter C. St. John, Michael F. Crowley, Yannick J. Bomble

Abstract:

Production of chemicals from engineered organisms in a batch culture typically involves a trade-off between productivity, yield, and titer. However, strategies for strain design typically involve designing mutations to achieve the highest yield possible while maintaining growth viability. Such approaches tend to follow the principle of designing static networks with minimum metabolic functionality to achieve desired yields. While these methods are computationally tractable, optimum productivity is likely achieved by a dynamic strategy, in which intracellular fluxes change their distribution over time. One can use multi-stage fermentations to increase either productivity or yield. Such strategies would range from simple manipulations (aerobic growth phase, anaerobic production phase), to more complex genetic toggle switches. Additionally, some computational methods can also be developed to aid in optimizing two-stage fermentation systems. One can assume an initial control strategy (i.e., a single reaction target) in maximizing productivity - but it is unclear how close this productivity would come to a global optimum. The calculation of maximum theoretical yield in metabolic engineering can help guide strain and pathway selection for static strain design efforts. Here, we present a method for the calculation of a maximum theoretical productivity of a batch culture system. This method follows the traditional assumptions of dynamic flux balance analysis: that internal metabolite fluxes are governed by a pseudo-steady state and external metabolite fluxes are represented by dynamic system including Michealis-Menten or hill-type regulation. The productivity optimization is achieved via dynamic programming, and accounts explicitly for an arbitrary number of fermentation stages and flux variable changes. We have applied our method to succinate production in two common microbial hosts: E. coli and A. succinogenes. 
The method can be further extended to calculate the complete productivity versus yield Pareto surface. Our results demonstrate that nearly optimal yields and productivities can indeed be achieved with only two discrete flux stages.
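The two-stage idea described above can be sketched in toy form. The following is not the authors' method, only a minimal batch model with Michaelis-Menten substrate uptake and a hard flux switch from a growth stage to a production stage; all parameter values (Vmax, Km, yields, switch time) are invented for illustration.

```python
def michaelis_menten(s, vmax, km):
    """Specific substrate uptake rate as a function of substrate concentration."""
    return vmax * s / (km + s)

def two_stage_batch(s0=50.0, x0=0.1, t_switch=5.0, t_end=30.0, dt=0.01,
                    vmax=10.0, km=1.0, y_x=0.05, y_p=0.9):
    """Euler integration of biomass x, substrate s and product p.
    Stage 1 (t < t_switch): all uptake flux goes to growth.
    Stage 2: all uptake flux is redirected to product formation."""
    s, x, p, t = s0, x0, 0.0, 0.0
    while t < t_end and s > 1e-9:
        v = michaelis_menten(s, vmax, km) * x    # total uptake at this instant
        if t < t_switch:
            x += y_x * v * dt                    # growth stage
        else:
            p += y_p * v * dt                    # production stage
        s = max(0.0, s - v * dt)
        t += dt
    return s, x, p, t

s, x, p, t = two_stage_batch()
productivity = p / t        # batch productivity of this two-stage strategy
```

Sweeping `t_switch` in such a model traces out the productivity-versus-yield trade-off that the abstract's dynamic programming approach optimizes rigorously.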

Keywords: A. succinogenes, E. coli, metabolic engineering, metabolite fluxes, multi-stage fermentations, succinate

Procedia PDF Downloads 211
184 Exploring Mothers' Knowledge and Experiences of Attachment in the First 1000 Days of Their Child's Life

Authors: Athena Pedro, Zandile Batweni, Laura Bradfield, Michael Dare, Ashley Nyman

Abstract:

The rapid growth and development of an infant in the first 1000 days of life means that this time period provides the greatest opportunity for a positive developmental impact on a child’s life socially, emotionally, cognitively and physically. Current research is being focused on children in the first 1000 days, but there is a lack of research on, and understanding of, mothers and their experiences during this crucial time period. Thus, it is imperative that more research is done to help better understand the experiences of mothers during the first 1000 days of their child’s life, as well as to gain more insight into mothers’ knowledge regarding this time period. The first 1000 days of life, from conception to two years, is a critical period, and the child’s attachment to his or her mother or primary caregiver during this period is crucial for a multitude of future outcomes. The aim of this study was to explore mothers’ understanding and experience of the first 1000 days of their child’s life, specifically looking at attachment in the context of Bowlby and Ainsworth’s attachment theory. Using a qualitative methodological framework, data were collected through semi-structured individual interviews with 12 first-time mothers from low-income communities in Cape Town. Thematic analysis of the data revealed that mothers articulated the importance of attachment within the first 1000 days of life and shared experiences of how they bond and form attachment with their babies. Furthermore, these mothers expressed their belief in the long-term effects of early attachment and responsive positive parenting, as well as the lasting effects of poor attachment and non-responsive parenting. This study has implications for new mothers and healthcare staff working with mothers of newborn babies, as well as for future contextual research.
By gaining insight into the mothers’ experiences, policies and intervention efforts can be formulated to assist mothers during this time, ultimately promoting the healthy development of the nation’s children and future adult generations. If researchers are also able to understand the extent of mothers’ general knowledge regarding the first 1000 days and attachment, there will be a better understanding of where there may be gaps in knowledge, and recommendations for effective and relevant intervention efforts may be provided. These interventions may increase the knowledge and awareness of new mothers and of health care workers at clinics and other service providers, creating a high impact on positive outcomes. Improving the developmental trajectory of many young babies in this way allows them the opportunity to pursue optimal development and reach their full potential.

Keywords: attachment, experience, first 1000 days, knowledge, mothers

Procedia PDF Downloads 172
183 Fabrication of Superhydrophobic Galvanized Steel by Sintering Zinc Nanopowder

Authors: Francisco Javier Montes Ruiz-Cabello, Guillermo Guerrero-Vacas, Sara Bermudez-Romero, Miguel Cabrerizo Vilchez, Miguel Angel Rodriguez-Valverde

Abstract:

Galvanized steel is one of the most widespread metallic materials used in industry. It consists of an iron-based alloy (steel) coated with a layer of zinc of variable thickness. The zinc layer protects the inner steel from corrosion and staining. Galvanized steel is cheaper to produce than stainless steel, which is why it is employed in the construction of materials with large dimensions in aeronautics, urban/industrial construction, and ski resorts. In all these applications, turning the natural hydrophilicity of the metal surface into superhydrophobicity is particularly interesting and would open up a wide variety of additional functionalities. However, producing a superhydrophobic surface on galvanized steel can be a very difficult task. Superhydrophobic surfaces are characterized by a specific surface texture, which is reached either by coating the surface with a material that incorporates such texture, or by applying one of several roughening methods. Since galvanized steel is already a coated material, the incorporation of a second coating may be undesired. On the other hand, the methods that are recurrently used to incorporate the surface texture leading to superhydrophobicity in metals are aggressive and may damage the surface. In this work, we used a novel strategy whose goal is to produce superhydrophobic galvanized steel by a two-step, non-aggressive process. The first step creates a hierarchical structure by sintering zinc nanoparticles on the surface at a temperature slightly below the melting point of zinc. The second is hydrophobization by deposition of a thick fluoropolymer layer. The wettability of the samples is characterized by tilting-plate and bouncing-drop experiments, while the roughness is analyzed by confocal microscopy. The durability of the produced surfaces was also explored.

Keywords: galvanized steel, superhydrophobic surfaces, sintering nanoparticles, zinc nanopowder

Procedia PDF Downloads 144
182 Improving the Technology of Assembly by Use of Computer Calculations

Authors: Mariya V. Yanyukina, Michael A. Bolotov

Abstract:

Assembly accuracy is the degree of accordance between the actual values of the parameters obtained during assembly and the values specified in the assembly drawings and technical specifications. However, assembly accuracy depends not only on the quality of the production process but also on the correctness of the assembly process. Therefore, preliminary calculations of assembly stages are carried out to verify the correspondence of the real geometric parameters to their acceptable values. In the aviation industry, most calculations involve interacting dimensional chains, which greatly complicates the task. Solving such problems requires a special approach. The purpose of this article is to address the problem of improving the assembly technology of aviation units by means of computer calculations. A practical example of an assembly unit containing an interacting dimensional chain is the turbine wheel of a gas turbine engine. The dimensional chain of the turbine wheel is formed by the geometric parameters of the disk and the set of blades. The interaction consists in the formation of two chains. The first chain is formed by the dimensions that determine the location of the grooves for the installation of the blades, together with the dimensions of the blade roots. The second chain is formed by the dimensions of the airfoil shroud platform. The interaction between the two chains of the turbine wheel lies in their interdependence through the power circuits formed by the middle parts of the turbine blades. The calculation of the dimensional chain of the turbine wheel is timely because of the need to improve the assembly technology of this unit.
The task at hand contains geometric and mathematical components; therefore, its solution can be implemented following this algorithm: 1) research and analysis of production errors of geometric parameters; 2) development of a parametric model in the CAD system; 3) creation of a set of CAD models of parts, taking into account actual or generalized distributions of the errors of the geometric parameters; 4) calculation of the model in the CAE system, loading various combinations of models of parts; 5) accumulation of statistics and analysis. The main task is to pre-simulate the assembly process by calculating the interacting dimensional chains. The article describes the approach to the solution from the point of view of mathematical statistics, implemented in the software package Matlab. Within the framework of the study, the components of the turbine wheel (blades and disks) were measured, and it is expected that the assembly process of the unit will be optimized by solving the dimensional chains.
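The statistical core of such a pre-simulation can be sketched as a Monte Carlo tolerance stack-up (the paper itself uses Matlab; this Python sketch and all nominal dimensions, tolerances and chain signs are invented for illustration):

```python
# Hedged illustration: Monte Carlo simulation of a dimensional chain.
# Component dimensions are drawn from normal distributions around their
# nominal values; the closing dimension is their signed sum and is checked
# against an acceptance band of +/-0.05 mm.
import random

random.seed(42)

def simulate_chain(nominals, sigmas, signs, n=100_000):
    """Return the share of simulated assemblies whose closing dimension
    stays within +/-0.05 mm of the nominal closing dimension."""
    nominal_close = sum(s * d for s, d in zip(signs, nominals))
    ok = 0
    for _ in range(n):
        close = sum(s * random.gauss(d, sd)
                    for s, d, sd in zip(signs, nominals, sigmas))
        if abs(close - nominal_close) <= 0.05:
            ok += 1
    return ok / n

# hypothetical links: groove position, blade root width, platform width (mm);
# the signs define how each link enters the chain
yield_rate = simulate_chain(nominals=[120.0, 15.0, 14.9],
                            sigmas=[0.01, 0.015, 0.015],
                            signs=[+1, -1, -1])
```

Feeding measured (rather than assumed) error distributions into the same loop is exactly the step the abstract describes for the blade and disk data.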

Keywords: accuracy, assembly, interacting dimension chains, turbine

Procedia PDF Downloads 370
181 Relativity in Toddlers' Understanding of the Physical World as Key to Misconceptions in the Science Classroom

Authors: Michael Hast

Abstract:

Within their first year, infants can differentiate between objects based on their weight. By at least 5 years of age, children hold consistent weight-related misconceptions about the physical world, such as that heavy things fall faster than lighter ones because of their weight. Such misconceptions are seen as a challenge for science education since they are often highly resistant to change through instruction. Understanding the time point of emergence of such ideas could, therefore, be crucial for early science pedagogy. The paper thus discusses two studies that jointly address the issue by examining young children’s search behaviour in hidden displacement tasks under consideration of relative object weight. In both studies, children were tested with a heavy or a light ball, and they either had information about one of the balls only or about both. In Study 1, 88 toddlers aged 2 to 3½ years watched a ball being dropped into a curved tube and were then allowed to search for the ball in three locations – one straight beneath the tube entrance, one where the curved tube led to, and one that corresponded to neither of the previous outcomes. Success and failure at the task were not impacted by the weight of the balls alone in any particular way. However, from around 3 years onwards, relative lightness, gained through tactile experience of both balls beforehand, enhanced search success. Conversely, relative heaviness increased search errors such that children increasingly searched in the location immediately beneath the tube entry – known as the gravity bias. In Study 2, 60 toddlers aged 2, 2½ and 3 years watched a ball roll down a ramp and behind a screen with four doors, with a barrier placed along the ramp behind one of the four doors. Toddlers were allowed to open the doors to find the ball. While search accuracy generally increased with age, relative weight did not play a role in 2-year-olds’ search behaviour. Relative lightness improved 2½-year-olds’ searches.
At 3 years, both relative lightness and relative heaviness had a significant impact, with the former improving search accuracy and the latter reducing it. Taken together, both studies suggest that between 2 and 3 years of age, relative object weight is increasingly taken into consideration in navigating naïve physical concepts. In particular, it appears to contribute to the early emergence of misconceptions relating to object weight. This insight from developmental psychology research may have consequences for early science education and related pedagogy towards early conceptual change.

Keywords: conceptual development, early science education, intuitive physics, misconceptions, object weight

Procedia PDF Downloads 187
180 Microbiological Assessment of Soft Cheese (Wara), Raw Milk and Dairy Drinking Water from Selected Farms in Ido, Ibadan, Nigeria

Authors: Blessing C. Nwachukwu, Michael O. Taiwo, Wasiu A. Abibu, Isaac O. Ayodeji

Abstract:

Milk is an important source of micro- and macronutrients for humans. Soft cheese (Wara) is an example of a by-product of milk. In addition, water is considered one of the most vital resources on cattle farms. Due to the high consumption rate of milk and soft cheese and the traditional techniques involved in their production in Nigeria, there was a need for a microbiological assessment, which is of utmost public health importance. The study thus investigated the microbial risks associated with consumption of milk and soft cheese (Wara). It also investigated common pathogens present in dairy water on farms, and antibiotic sensitivity profiling of the implicated pathogens was conducted. Samples were collected from three different Fulani dairy herds in Ido local government, Ibadan, Oyo State, Nigeria and subjected to microbiological evaluation and antimicrobial susceptibility testing. Aspergillus flavus was the only fungal isolate from Wara, while Staphylococcus aureus, Vibrio cholerae, Hafnia alvei, Proteus mirabilis, Escherichia coli, Pseudomonas aeruginosa, Citrobacter freundii, and Klebsiella pneumoniae were the bacterial genera isolated from Wara, dairy milk and dairy drinking water. Bacterial counts from Wara from the three selected farms A, B and C were 3.5×10⁵ CFU/ml, 4.0×10⁵ CFU/ml and 5.3×10⁵ CFU/ml respectively, while the fungal count was 3 CFU/100 µl. The total bacterial counts from dairy milk from the three selected farms A, B and C were 2.0×10⁵ CFU/ml, 3.5×10⁵ CFU/ml and 6.5×10⁵ CFU/ml respectively. Bacterial counts of 1.4×10⁵ CFU/ml, 1.9×10⁵ CFU/ml and 4.9×10⁵ CFU/ml were recorded from dairy drinking water from farms A, B and C respectively. The highest antimicrobial resistance of 100% was recorded in Wara with Enrofloxacin, Gentamicin, Ceftriaxone and Colistin. The highest antimicrobial susceptibility of 100% was recorded in raw milk with Enrofloxacin and Gentamicin.
The highest intermediate antimicrobial response of 100% was recorded in raw milk with Streptomycin. The study revealed that most of the cheeses sold in Ido Local Government are contaminated with pathogens. Further research is needed on standardizing the production method to prevent pathogens from gaining access. The presence of bacteria in raw milk indicated contamination due to poor handling and unhygienic practices. Thus, drinking unpasteurized milk is hazardous, as it increases the risk of zoonoses. Also, the provision of quality drinking water is crucial for optimum dairy productivity. Health education programs aimed at increasing awareness of the importance of clean water for animal health will be helpful.

Keywords: dairy, raw milk, soft cheese, Wara

Procedia PDF Downloads 173
179 Considering Uncertainties of Input Parameters on Energy, Environmental Impacts and Life Cycle Costing by Monte Carlo Simulation in the Decision Making Process

Authors: Johannes Gantner, Michael Held, Matthias Fischer

Abstract:

The refurbishment of the building stock in terms of energy supply and efficiency is one of the major challenges of the German turnaround in energy policy. As the building sector accounts for 40% of Germany’s total energy demand, additional insulation is key for energy-efficient refurbished buildings. Nevertheless, despite the energetic benefits, the environmental and economic performance of insulation materials is often questioned. The methods of Life Cycle Assessment (LCA) and Life Cycle Costing (LCC) can form a standardized basis for addressing these doubts and are becoming increasingly important for material producers due to efforts such as the Product Environmental Footprint (PEF) or Environmental Product Declarations (EPD). Due to the increasing use of LCA and LCC information for decision support, the robustness and resilience of the results become crucial, especially in support of decision and policy makers. LCA and LCC results are based on models which depend on technical parameters such as efficiencies, material and energy demand, product output, etc. Nevertheless, the influence of parameter uncertainties on lifecycle results is usually not considered, or studied only superficially. The effect of parameter uncertainties cannot be neglected: based on the example of an exterior wall, the overall lifecycle results vary by a factor of more than three. As a result, the simple best-case/worst-case analyses used in practice are not sufficient. These analyses allow a first rough view of the results but do not take effects such as error propagation into account. Consequently, LCA practitioners cannot provide further guidance for decision makers. Probabilistic analyses enable LCA practitioners to gain a deeper understanding of LCA and LCC results and to provide better decision support.
Within this study, the environmental and economic impacts of an exterior wall system over its whole lifecycle are illustrated, and the effect of different uncertainty analyses on the interpretation in terms of resilience and robustness is shown. Here, the approaches of error propagation and Monte Carlo simulation are applied and combined with statistical methods in order to allow a deeper understanding and interpretation. Overall, this study emphasizes the need for a deeper and more detailed probabilistic evaluation based on statistical methods. Only in this way can misleading interpretations be avoided and the results used for resilient and robust decisions.
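The contrast between a best-case/worst-case pair and a probabilistic evaluation can be illustrated with a toy Monte Carlo propagation. This is not the study's model; the lifecycle function and every distribution and number below are invented for the example:

```python
# Toy uncertainty propagation: GWP of an insulated wall = production impact
# per kg of material * material mass + use-phase impact, with each parameter
# drawn from an assumed distribution instead of a single fixed value.
import random, statistics

random.seed(0)

def lifecycle_gwp():
    """One Monte Carlo draw of a hypothetical lifecycle GWP (kg CO2e)."""
    impact_per_kg = random.lognormvariate(0.0, 0.2)   # production, per kg
    mass = random.gauss(100.0, 10.0)                  # kg of insulation
    use_phase = random.gauss(500.0, 50.0)             # lifetime heating impact
    return impact_per_kg * mass + use_phase

samples = sorted(lifecycle_gwp() for _ in range(50_000))
mean = statistics.mean(samples)
p05, p95 = samples[2_500], samples[47_500]            # empirical 90% interval
```

Reporting the full interval (p05, p95) instead of two extreme point values is the kind of probabilistic statement the abstract argues for.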

Keywords: uncertainty, life cycle assessment, life cycle costing, Monte Carlo simulation

Procedia PDF Downloads 282
178 Balanced Scorecard (BSC) Project: A Methodological Proposal for Decision Support in a Corporate Scenario

Authors: David de Oliveira Costa, Miguel Ângelo Lellis Moreira, Carlos Francisco Simões Gomes, Daniel Augusto de Moura Pereira, Marcos dos Santos

Abstract:

Strategic management is a fundamental process for global companies that intend to remain competitive in an increasingly dynamic and complex market. To do so, it is necessary to maintain alignment with their principles and values. The Balanced Scorecard (BSC) proposes to ensure that overall business performance is assessed from different perspectives (financial, customer, internal processes, and learning and growth). However, relying solely on the BSC may not be enough to ensure the success of strategic management. It is essential that companies also evaluate and prioritize the strategic projects that need to be implemented, to ensure they are aligned with the business vision and contribute to achieving established goals and objectives. In this context, the proposition involves the incorporation of the SAPEVO-M multicriteria method to indicate the degree of relevance between different perspectives. Thus, the strategic objectives linked to these perspectives have greater weight in the classification of structural projects. Additionally, it is proposed to apply the concept of the Impact & Probability Matrix (I&PM) to structure and ensure that strategic projects are evaluated according to their relevance and impact on the business. By structuring the business’s strategic management in this way, alignment and prioritization of projects and actions related to strategic planning are ensured, so that resources are directed towards the most relevant and impactful initiatives. Therefore, the objective of this article is to present a proposal for integrating the BSC methodology, the SAPEVO-M multicriteria method, and the prioritization matrix to establish a concrete weighting of strategic planning and obtain coherence in defining strategic projects aligned with the business vision, supporting a robust decision-making process.
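The prioritization idea can be illustrated with a toy score. This is not the SAPEVO-M method itself; the perspective weights merely stand in for what such a multicriteria method would produce, and all project names, impact scores and probabilities are hypothetical:

```python
# Toy project prioritization: impact on each BSC perspective, weighted by
# perspective relevance (the weights a multicriteria method would supply),
# scaled by the project's probability of success (the I&PM idea).

PERSPECTIVE_WEIGHTS = {"financial": 0.35, "customer": 0.30,
                       "internal": 0.20, "learning": 0.15}

def project_score(impacts, probability):
    """impacts: dict perspective -> impact on a 1-5 scale."""
    weighted = sum(PERSPECTIVE_WEIGHTS[p] * v for p, v in impacts.items())
    return weighted * probability

projects = {
    "ERP rollout": project_score({"financial": 4, "customer": 2,
                                  "internal": 5, "learning": 3}, 0.7),
    "Loyalty app": project_score({"financial": 3, "customer": 5,
                                  "internal": 2, "learning": 2}, 0.9),
}
ranking = sorted(projects, key=projects.get, reverse=True)
```

The ranking then feeds the portfolio decision; in the proposal, the weights would come from SAPEVO-M rather than being fixed by hand.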

Keywords: MCDA process, prioritization problematic, corporate strategy, multicriteria method

Procedia PDF Downloads 73
177 Spatio-Temporal Analysis of Rabies Incidence in Herbivores of Economic Interest in Brazil

Authors: Francisco Miroslav Ulloa-Stanojlovic, Gina Polo, Ricardo Augusto Dias

Abstract:

In Brazil, the high incidence of rabies in herbivores of economic interest (HEI) transmitted by the common vampire bat Desmodus rotundus, the occurrence of human rabies cases, and the huge economic losses in the world’s largest cattle industry make it important to assist the National Program for the Control of Rabies in Herbivores in Brazil, which aims to reduce the incidence of rabies in HEI populations, mainly through epidemiological surveillance, vaccination of herbivores and control of vampire-bat roosts. Material and Methods: A retrospective spatiotemporal Kulldorff’s spatial scan statistic, based on a Poisson model and Monte Carlo simulation, and Anselin’s Local Moran’s I statistic were used to uncover spatial clustering of HEI rabies from 2000 to 2014. Results: Three important clusters with significant year-to-year variation were identified (Figure 1). In 2000, one cluster was identified in the North region, specifically in the State of Tocantins. Between 2000 and 2004, a cluster centered in the Midwest and Southeast regions, including the States of Goiás, Minas Gerais, Rio de Janeiro, Espírito Santo and São Paulo, was prominent. Finally, between 2000 and 2005, an important cluster was found in the North, Midwest and South regions. Conclusions: HEI rabies is endemic in the country. In addition, there appear to be significant differences among the States according to their surveillance services, which may be hindering the control of the disease. Other factors could also be contributing to the maintenance of this problem, such as the lack of information on the identification of vampire-bat roosts and limited human resources for field monitoring. A review of the control program by the authorities is necessary.
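The core of the scan statistic used above can be sketched as follows. This is a hedged illustration, not the authors' implementation: for each candidate zone, the observed case count c is compared with the expected count e under the null, and the zone with the largest log-likelihood ratio is retained. The zone data are invented:

```python
import math

def poisson_llr(c, e, C, E):
    """Log-likelihood ratio of Kulldorff's Poisson scan statistic for a zone
    with c observed / e expected cases, out of C observed / E expected in total."""
    if e <= 0 or E - e <= 0 or c / e <= (C - c) / (E - e):
        return 0.0  # zone shows no excess risk
    llr = c * math.log(c / e)
    if C - c > 0:
        llr += (C - c) * math.log((C - c) / (E - e))
    return llr

# toy zone data: (name, observed HEI rabies cases, expected cases)
zones = [("Tocantins", 30, 12.0), ("Goias", 18, 15.0), ("Sao Paulo", 9, 10.0)]
C = sum(z[1] for z in zones)
E = sum(z[2] for z in zones)
best = max(zones, key=lambda z: poisson_llr(z[1], z[2], C, E))
```

A full implementation would scan many candidate circular space-time zones and calibrate the maximum LLR by Monte Carlo replication under the null, as the abstract's Methods describe.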

Keywords: Brazil, Desmodus rotundus, herbivores, rabies

Procedia PDF Downloads 413
176 Globally Convergent Sequential Linear Programming for Multi-Material Topology Optimization Using Ordered Solid Isotropic Material with Penalization Interpolation

Authors: Darwin Castillo Huamaní, Francisco A. M. Gomes

Abstract:

The aim of multi-material topology optimization (MTO) is to obtain the optimal topology of structures composed of many materials, according to a given set of constraints and cost criteria. In this work, we seek the optimal distribution of materials in a domain, such that the flexibility of the structure is minimized, under certain boundary conditions and the intervention of external forces. In the single-material case, each element of the discretized domain is represented by a function taking one of two values: 1 if the element belongs to the structure, or 0 if the element is empty. A common way to avoid the high computational cost of solving integer variable optimization problems is to adopt the Solid Isotropic Material with Penalization (SIMP) method. This method relies on a continuous power-law interpolation function whose base variable represents a pseudo-density at each point of the domain. For proper exponent values, the SIMP method reduces intermediate densities, since values other than 0 or 1 usually do not have a physical meaning for the problem. Several extensions of the SIMP method have been proposed for the multi-material case. The one that we explore here is the ordered SIMP method, which has the advantage of not being based on the addition of variables to represent material selection, so the computational cost is independent of the number of materials considered. Although the number of variables is not increased by this algorithm, the optimization subproblems generated at each iteration cannot be solved by methods that rely on second derivatives, due to the cost of calculating them. To overcome this, we apply a globally convergent version of the sequential linear programming method, which solves a sequence of linear approximations of the optimization problem.
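The penalization step the abstract describes can be shown in its classic single-material form (the ordered SIMP scheme extends this single-variable idea to several materials; the exponent and moduli below are conventional illustrative values, not taken from the paper):

```python
def simp_modulus(rho, e0=1.0, e_min=1e-9, p=3.0):
    """Classic SIMP power law: E(rho) = E_min + rho**p * (E0 - E_min),
    with a tiny E_min so empty elements keep the stiffness matrix invertible."""
    return e_min + (rho ** p) * (e0 - e_min)

# Penalization makes intermediate densities uneconomical: half the material
# buys only about one-eighth of the stiffness, pushing the optimizer
# towards 0/1 (void/solid) designs.
half = simp_modulus(0.5)
full = simp_modulus(1.0)
```

With exponent p = 3, an element at density 0.5 contributes far less stiffness per unit of material than a solid element, which is exactly why intermediate densities are driven out of the optimal design.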

Keywords: global convergence, multi-material design, ordered SIMP, sequential linear programming, topology optimization

Procedia PDF Downloads 312
175 Time of Week Intensity Estimation from Interval Censored Data with Application to Police Patrol Planning

Authors: Jiahao Tian, Michael D. Porter

Abstract:

Law enforcement agencies are tasked with crime prevention and crime reduction under limited resources. Having an accurate temporal estimate of the crime rate would be valuable to achieve such a goal. However, estimation is usually complicated by the interval-censored nature of crime data. We cast the problem of intensity estimation as a Poisson regression, using an EM algorithm to estimate the parameters. Two special penalties are added that provide smoothness over the time of day and the day of the week. The approach presented here provides accurate intensity estimates and can also uncover day-of-week clusters that share the same intensity patterns. Anticipating where and when crimes might occur is a key element of successful policing strategies. However, this task is complicated by the presence of interval-censored data. Censored data refers to data for which the event time is only known to lie within an interval instead of being observed exactly. This type of data is prevalent in the field of criminology because of the absence of victims for certain types of crime. Despite its importance, research on the temporal analysis of crime has lagged behind the spatial component. Inspired by the success of solving crime-related problems with a statistical approach, we propose a statistical model for the temporal intensity estimation of crime with censored data. The model is built on Poisson regression and has special penalty terms added to the likelihood. An EM algorithm was derived to obtain maximum likelihood estimates, and the resulting model shows superior performance to the competing model. Our research is in line with the Smart Policing Initiative (SPI) proposed by the Bureau of Justice Assistance (BJA) as an effort to support law enforcement agencies in building evidence-based, data-driven law enforcement tactics. The goal is to identify strategic approaches that are effective in crime prevention and reduction.
In our case, we allow agencies to deploy their resources for a relatively short period of time to achieve the maximum level of crime reduction. By analyzing a particular area within cities where data are available, our proposed approach provides not only an accurate estimate of intensities for the time unit considered but also a time-varying crime incidence pattern. Both will be helpful in the allocation of limited resources, either by improving the existing patrol plan using the discovered day-of-week clusters or by supporting the deployment of extra resources.
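The E-step/M-step cycle for interval-censored counts can be sketched in a simplified form (the smoothness penalties and regression structure of the actual model are omitted; the six-hour grid and toy events are invented):

```python
# Simplified EM for interval-censored Poisson intensity: each crime is known
# only to have occurred within an interval of hours. The E-step allocates
# each event across its interval in proportion to the current rates; the
# M-step re-estimates the per-hour rates from the allocated counts.
HOURS = 6

def em_intensity(events, n_iter=200):
    """events: list of (start, end) half-open hour intervals censoring each crime."""
    rates = [1.0] * HOURS
    for _ in range(n_iter):
        expected = [0.0] * HOURS
        for a, b in events:                 # E-step: fractional allocation
            total = sum(rates[a:b])
            for h in range(a, b):
                expected[h] += rates[h] / total
        rates = [max(c, 1e-12) for c in expected]   # M-step, unit exposure
    return rates

# ten crimes known to hour 2, five censored over hours 0-2, one in hour 5
events = [(2, 3)] * 10 + [(0, 3)] * 5 + [(5, 6)]
rates = em_intensity(events)
```

In this toy example the censored mass migrates to hour 2, where the exactly-observed events concentrate, which is the behaviour the EM iteration is designed to produce.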

Keywords: cluster detection, EM algorithm, interval censoring, intensity estimation

Procedia PDF Downloads 63
174 The Impact of Technology and Artificial Intelligence on Children in Autism

Authors: Dina Moheb Rashid Michael

Abstract:

A descriptive statistical analysis of the data showed that the most important factor evoking negative attitudes among teachers is student behavior. Syndromic conditions such as Down syndrome (DS) and fragile X syndrome (FXS) have been presented as useful models for understanding the risk factors and protective factors associated with the emergence of autistic traits. Although these "syndromic" forms of autism reach clinical thresholds, they appear to be distinctly different from the idiopathic or "non-syndromic" autism phenotype. Most teachers reported that kindergartens did not prepare them for the educational needs of children with autism, particularly in relation to non-verbal skills. The study is important and points the way towards improving inclusive teacher education in Thailand. Inclusive education for students with autism is still in its infancy in Thailand. Although the number of autistic children in schools has increased significantly since the Thai government introduced the Education Regulations for Persons with Disabilities Act in 2008, there is a general lack of services for autistic students and their families. This quantitative study used the Attitude and Preparedness to Teach Students with Autism Scale (APTSAS) to test the attitudes and readiness of 110 elementary school teachers when teaching students with autism in general education classrooms. To uncover the true nature of these comorbidities, it is necessary to expand the definition of autism to include the cognitive features of the disorder, and then apply this expanded conceptualization to examine patterns of autistic syndromes. This study used various established eye-tracking paradigms to assess the visual and attention performance of children with DS and FXS who meet the autism thresholds defined in the Social Communication Questionnaire,
in order to study whether the autistic profiles of these children are associated with visual orienting difficulties ("sticky attention"), decreased social attention, and increased visual search performance, all of which are hallmarks of the idiopathic autism phenotype. Data will be collected from children with DS and FXS, aged 6 to 10 years, and two control groups matched for age and intellectual ability (i.e., children with idiopathic autism). In order to enable a comparison of visual attention profiles, cross-sectional analyses of developmental trajectories are carried out. Significant differences in the visual-attentive processes underlying the presentation of autism in children with FXS and DS have been suggested, supporting the concept of syndrome specificity. The study provides insights into the complex heterogeneity associated with autism syndrome symptoms and autism itself, with clinical implications for the utility of autism intervention programs in DS and FXS populations.

Keywords: attitude, autism, teachers, sports activities, movement skills, motor skills

Procedia PDF Downloads 51
173 Willingness to Adopt "Green Steel" Products: A Case Study from the Automotive Sector

Authors: Hasan Muslemani, Jeffrey Wilson, Xi Liang, Francisco Ascui, Katharina Kaesehage

Abstract:

This paper aims to examine consumer behaviour towards, and willingness to adopt, green steel in the automotive sector, in order to identify potential barriers and opportunities for its widespread adoption. Semi-structured interviews were held with experts from global, regional and country-specific industry associations and automakers. The analysis shows there is a new shift towards lifecycle thinking in the sector, although these efforts have been voluntary and driven by customer and employee pressures rather than regulation. The paper further appraises possible demand for green steel within different vehicle types (based on size and powertrain), and shows that manufacturers of electric heavy-duty vehicles are most likely to adopt green steel in the first instance, given the amount of steel incorporated in the vehicles and the fact that lifecycle emissions lie predominantly in their manufacturing phase. A case for green advanced high-strength steels (AHSS) can also be made in light-duty passenger vehicles, which may mitigate competition from lightweight alternative materials in terms of cost and greenness (depending on source and utilisation zones). This work builds on a wide sustainability-related literature in the automotive sector and highlights areas in need of urgent action if the sector as a whole is to meet its Paris Agreement climate targets, in particular a need to revisit current CO2 performance regulations to include Scope 1 and Scope 2 emissions, engage in educational green marketing campaigns, and explore innovative market-based mechanisms to bridge the gap between the relatively low carbon abatement costs of steelmaking and the high abatement costs of vehicle manufacturing.

Keywords: green steel, consumer behaviour, automotive industry, environmental sustainability

Procedia PDF Downloads 159
172 MicroRNA-1246 Expression Associated with Resistance to Oncogenic BRAF Inhibitors in Mutant BRAF Melanoma Cells

Authors: Jae-Hyeon Kim, Michael Lee

Abstract:

Intrinsic and acquired resistance limits the therapeutic benefits of oncogenic BRAF inhibitors in melanoma. MicroRNAs (miRNAs) regulate the expression of target mRNAs by repressing their translation. We therefore investigated miRNA expression patterns in melanoma cell lines to identify candidate biomarkers for acquired resistance to BRAF inhibitors. Here, we used the Affymetrix miRNA V3.0 microarray profiling platform to compare miRNA expression levels in three cell lines: BRAF inhibitor-sensitive A375P BRAF V600E cells, their BRAF inhibitor-resistant counterparts (A375P/Mdr), and SK-MEL-2 BRAF-WT cells with intrinsic resistance to BRAF inhibition. miRNAs with at least a two-fold change in expression between BRAF inhibitor-sensitive and -resistant cell lines were identified as differentially expressed. Averaged intensity measurements identified 138 and 217 miRNAs that were differentially expressed by two-fold or more between 1) A375P and A375P/Mdr and 2) A375P and SK-MEL-2, respectively. Hierarchical clustering revealed differences in the miRNA expression profiles between BRAF inhibitor-sensitive and -resistant cell lines for miRNAs involved in intrinsic and acquired resistance to BRAF inhibitors. In particular, 43 miRNAs were identified whose expression was consistently altered in the two BRAF inhibitor-resistant cell lines, regardless of whether the resistance was intrinsic or acquired. Twenty-five miRNAs were consistently upregulated and 18 downregulated by more than two-fold. Although some discrepancies were detected when the miRNA microarray data were compared with qPCR-measured expression levels, qRT-PCR results for five miRNAs (miR-3617, miR-92a1, miR-1246, miR-1936-3p, and miR-17-3p) showed excellent agreement with the microarray experiments. To further investigate the cellular functions of these miRNAs, we examined their effects on cell proliferation. Synthetic oligonucleotide miRNA mimics were transfected into the three cell lines, and proliferation was quantified using a colorimetric assay.
Of the five miRNAs tested, only miR-1246 altered the proliferation of A375P/Mdr cells. Transfection of a miR-1246 mimic strongly conferred PLX-4720 resistance on A375P/Mdr cells, implying that miR-1246 upregulation confers acquired resistance to BRAF inhibition. We also found that PLX-4720 caused much greater G2/M arrest in A375P/Mdr cells transfected with the miR-1246 mimic than in scrambled RNA-transfected cells. Additionally, the miR-1246 mimic partially conferred resistance to autophagy induction by PLX-4720. These results indicate that autophagy plays an essential death-promoting role in PLX-4720-induced cell death. Taken together, these results suggest that miRNA expression profiling in melanoma cells can provide valuable information on the network of BRAF inhibitor resistance-associated miRNAs.
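The two-fold selection rule described above amounts to a |log2 fold-change| ≥ 1 cutoff on averaged intensities. A minimal sketch of that filter follows; the miRNA names and intensity values are invented for illustration and are not taken from the study.

```python
import math

# Hypothetical averaged intensities (sensitive A375P, resistant A375P/Mdr);
# all numbers are invented for illustration only.
expr = {
    "miR-1246":  (100.0, 420.0),  # up in the resistant line
    "miR-17-3p": (250.0, 110.0),  # down in the resistant line
    "miR-X":     (80.0, 90.0),    # below the two-fold cutoff
}

def is_differential(sensitive, resistant, cutoff=2.0):
    """Call a miRNA differentially expressed on a two-fold (or greater)
    change in either direction, i.e. |log2(resistant/sensitive)| >= 1."""
    return abs(math.log2(resistant / sensitive)) >= math.log2(cutoff)

for name, (s, r) in expr.items():
    flag = "differential" if is_differential(s, r) else "unchanged"
    print(f"{name}: {r / s:.2f}-fold ({flag})")
```

The same symmetric cutoff flags both the 4.2-fold upregulation and the roughly 2.3-fold downregulation while ignoring the sub-two-fold change.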

Keywords: microRNA, BRAF inhibitor, drug resistance, autophagy

Procedia PDF Downloads 323
171 The Role of Glyceryl Trinitrate (GTN) in 99mTc-HIDA with Morphine Provocation Scan for the Investigation of Type III Sphincter of Oddi Dysfunction (SOD)

Authors: Ibrahim M Hassan, Lorna Que, Michael Rutland

Abstract:

Type I SOD is usually diagnosed by anatomical imaging such as ultrasound, CT, and MRCP. However, types II and III SOD yield negative results on these modalities despite the presence of significant symptoms. Type III in particular is difficult to diagnose because of the absence of significant biochemical or anatomical abnormalities. Nuclear medicine can aid in this diagnostic dilemma by demonstrating functional changes in bile flow. Low-dose morphine (0.04 mg/kg) increases the tone of the sphincter of Oddi (SO), and its usefulness in diagnosing SOD has been shown through the delay in bile flow it causes compared with a non-morphine-provoked baseline scan. This work expands on that process by giving sublingual GTN at 60 minutes post tracer and morphine injection to relax the SO and induce an improvement in bile outflow, and in some cases to provide immediate relief of morphine-induced abdominal pain. The criteria for a positive SOD study during the first hour of morphine provocation are: (1) delayed tracer accumulation in the intrahepatic biliary ducts; (2) delayed appearance but persistent retention of activity in the common bile duct; and (3) delayed bile flow into the duodenum. In addition, a requirement for GTN within the first hour to relieve abdominal pain was regarded as highly supportive of the diagnosis. We retrospectively analysed 85 patients (pts) (78F and 6M) referred for suspected type III SOD who had been intensively investigated for recurrent right upper quadrant or abdominal pain post cholecystectomy. A 99mTc-HIDA scan with morphine provocation was performed, followed by GTN at 60 minutes post tracer injection, with a further thirty minutes of dynamic imaging. 30 pts were negative. 55 pts were regarded as positive for SOD, and 38/55 (69%) of these patients with an abnormal result were further evaluated with a baseline 99mTc-HIDA. As expected, all 38 pts showed better bile flow characteristics than during morphine provocation.
20/55 (36%) patients were treated by ERCP sphincterotomy, and the rest were managed conservatively with medical therapy. In all cases regarded as positive for SOD, sublingual GTN at 60 minutes produced immediate improvement in bile flow. 11/55 (20%) who developed severe post-morphine abdominal pain were relieved by GTN almost instantaneously. We propose that GTN is a useful agent in the diagnosis of SOD when performing a 99mTc-HIDA scan, and that a satisfactory response to sublingual GTN could offer additional information in patients who have severe pain at the time of the procedure or who present to the emergency unit with biliary pain, as well as in determining whether a trial of medical therapy may be used before considering surgery.

Keywords: GTN, HIDA, morphine, SOD

Procedia PDF Downloads 300
170 Reasons for the Selection of Information-Processing Framework and the Philosophy of Mind as a General Account for an Error Analysis and Explanation on Mathematics

Authors: Michael Lousis

Abstract:

This research study is concerned with learners' errors in Arithmetic and Algebra. The data came from a broader international comparative research program called the Kassel Project. However, its conceptualisation differed from and contrasted with that of the main program, which was mostly based on socio-demographic data. The way in which the research study was conducted was not left to the researcher's discretion but was dictated by the nature of the problem under investigation. This is because the phenomenon of learners' mathematical errors is due neither to the intentions of learners, nor to institutional processes, rules and norms, nor to educators' intentions and goals, but rather to the way certain information is presented to learners and how their cognitive apparatus processes this information. Several approaches to the study of learners' errors have been developed since the beginning of the 20th century, encompassing different belief systems. These approaches were based on behaviourist theory, on the Piagetian-constructivist research framework, on the perspective that followed the philosophy of science, and on the information-processing paradigm. The researcher of the present study had to disclose the learners' course of thinking that led them to specific observable actions, and thus to particular errors in specific problems, rather than analysing scripts in which the students' thoughts were presented in written form. This, in turn, entailed that the choice of methods had to be appropriate and conducive to seeing and realising the learners' errors from the perspective of the participants in the investigation. This particular fact determined important decisions concerning the selection of an appropriate framework for analysing the mathematical errors and giving explanations.
Thus the belief systems of behaviourism, Piagetian constructivism, and the philosophy-of-science perspective were rejected, and the information-processing paradigm in conjunction with the philosophy of mind was adopted as the general account for the elaboration of the data. This paper explains why these decisions were appropriate and beneficial for conducting the present study and for establishing the ensuing thesis. Additionally, it explains why the adoption of the information-processing paradigm in conjunction with the philosophy of mind provides a sound and legitimate basis for future studies of mathematical error analysis.

Keywords: advantages-disadvantages of theoretical perspectives, behaviourist perspective, critical evaluation of theoretical perspectives, error analysis, information-processing paradigm, opting for the appropriate approach, philosophy of science perspective, Piagetian-constructivist research frameworks, review of research in mathematical errors

Procedia PDF Downloads 187
169 Historic Fire Occurrence in Hemi-Boreal Forests: Exploring Natural and Cultural Scots Pine Multi-Cohort Fire Regimes in Lithuania

Authors: Charles Ruffner, Michael Manton, Gintautas Kibirkstis, Gediminas Brazaitas, Vitas Marozas, Ekaterine Makrickiene, Rutile Pukiene, Per Angelstam

Abstract:

In dynamic boreal forests, fire is an important natural disturbance that drives regeneration and mortality of living and dead trees, and thus successional trajectories. However, current forest management practices, focused solely on wood production, have effectively eliminated fire as a stand-level disturbance. While this is generally well studied across much of Europe, little is known in Lithuania about the historic fire regime and the role fire can play as a tool in the sustainable management of future landscapes. Focusing on Scots pine forests, we explore: i) the relevance of fire disturbance regimes on the forestlands of Lithuania; ii) fire occurrence in the Dzukija landscape for dry upland and peatland forest sites; and iii) correlations between tree-ring data and climate variables, to ascertain climatic influences on growth and fire occurrence. We sampled and cross-dated 132 fire-scarred Scots pine samples from 4 dry pine forest stands and 4 peatland forest stands. The fire history of each sample was analyzed using standard dendrochronological methods and presented in FHAES format. Analyses of soil moisture and nutrient conditions revealed a strong probability (59%) of high fire frequency in Scots pine forests, which cover 34.5% of Lithuania's current forestland. The fire history analysis revealed 455 fire scars and 213 fire events during the period 1742-2019. Within the Dzukija landscape, the mean fire interval was 4.3 years for the dry Scots pine forests and 8.7 years for the peatland Scots pine forests. However, our comparison of fire frequency before and after 1950 shows a marked decrease in the mean fire interval. Our data provide strong evidence that in the hemi-boreal forest landscapes of Lithuania fire, both human- and lightning-ignited, has been and should remain a natural phenomenon, and that biological archives can be used to guide sustainable forest management into the future.
Currently, the use of fire as a forest management tool is prohibited by law in Lithuania. We recommend introducing trials of low-intensity prescribed burning in Scots pine stands as a regeneration tool to mimic natural forest disturbance regimes.
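The mean fire interval reported above (4.3 years for dry stands, 8.7 for peatland) is simply the average gap between successive fire events in a composite chronology. A minimal sketch, with invented fire-event years rather than the study's actual chronology:

```python
# Invented composite fire-event years for one stand (illustrative only;
# the real record spans 1742-2019 and was built from 132 cross-dated samples).
fire_years = [1742, 1748, 1751, 1757, 1760, 1766, 1775]

def mean_fire_interval(years):
    """Average number of years between successive fire events."""
    intervals = [later - earlier for earlier, later in zip(years, years[1:])]
    return sum(intervals) / len(intervals)

print(mean_fire_interval(fire_years))  # intervals 6,3,6,3,6,9 -> 5.5
```

A before/after-1950 comparison like the one in the abstract would apply the same statistic to the two sub-lists of the chronology.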

Keywords: biodiversity conservation, cultural burning, dendrochronology, forest dynamics, forest management, succession

Procedia PDF Downloads 197
168 Increasing Power Transfer Capacity of Distribution Networks Using Direct Current Feeders

Authors: Akim Borbuev, Francisco de León

Abstract:

Economic and population growth in densely-populated urban areas introduce major challenges for distribution system operators, planners, and designers. To supply added loads, utilities are frequently forced to invest in new distribution feeders. However, this is becoming increasingly challenging due to space limitations and rising installation costs in urban settings. This paper proposes the conversion of critical alternating current (ac) distribution feeders into direct current (dc) feeders to increase the power transfer capacity by a factor as high as four. Current trends suggest that the return of dc transmission, distribution, and utilization is inevitable. Since a total system-level transformation to dc operation is not possible in a short period of time, given the huge investments needed and the unreadiness of utilities, this paper recommends that feeders expected to exceed their limits in the near future be converted to dc. The increase in power transfer capacity is achieved through several key differences between ac and dc power transmission systems. First, it is shown that underground cables can be operated at a higher dc voltage than ac voltage for the same dielectric stress in the insulation. Second, the cable sheath losses caused under ac operation by induced voltages yielding circulating currents, which can be as high as the phase conductor losses, are not present under dc. Finally, skin and proximity effects in conductors and sheaths do not exist in dc cables. The paper demonstrates that, in addition to the increased power transfer capacity, utilities substituting dc feeders for ac feeders could benefit from significantly lower costs and reduced losses. Installing dc feeders is less expensive than installing new ac feeders even when new trenches are not needed. Case studies using the IEEE 342-Node Low Voltage Networked Test System quantify the technical and economic benefits of dc feeders.
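Two of the effects described above can be put into a back-of-the-envelope form: dielectric stress scales with the peak field, so the same insulation can be run near the ac peak voltage (√2 × rms) under dc, and the disappearance of sheath, skin, and proximity losses frees thermal headroom for extra current. The sketch below isolates these per-conductor effects; every number in it is an assumption for illustration, not a figure from the paper.

```python
import math

# Assumed ratings for one underground cable conductor (illustrative only).
V_rms = 8.0e3   # ac line-to-neutral rms voltage, volts
I_amp = 400.0   # thermal ampacity under ac operation, amperes

p_ac = V_rms * I_amp  # per-conductor ac power

# Same insulation operated near the ac peak voltage under dc.
V_dc = math.sqrt(2) * V_rms

# Removing sheath circulating-current, skin, and proximity losses leaves
# headroom for more current; the 30% uplift here is an assumed value.
I_dc = 1.3 * I_amp

p_dc = V_dc * I_dc
print(f"capacity ratio dc/ac: {p_dc / p_ac:.2f}")
```

Under these assumptions the per-conductor gain is about √2 × 1.3 ≈ 1.84; the paper's fourfold figure would additionally reflect how the existing phase conductors are reconfigured as dc poles and how far the dc voltage can actually be raised.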

Keywords: DC power systems, distribution feeders, distribution networks, power transfer capacity

Procedia PDF Downloads 120