Search results for: extreme conditions
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 10503

573 Air Pollution on Stroke in Shenzhen, China: A Time-Stratified Case Crossover Study Modified by Meteorological Variables

Authors: Lei Li, Ping Yin, Haneen Khreis

Abstract:

Stroke was the second leading cause of death and the third leading cause of death and disability combined worldwide in 2019. Given the significant role of environmental factors in stroke development and progression, it is essential to investigate the effect of air pollution on stroke occurrence while considering the modifying effects of meteorological variables. This study aimed to evaluate the association between short-term exposure to air pollution and the incidence of stroke subtypes in Shenzhen, China, and to explore the potential interactions of meteorological factors with air pollutants. The study analyzed data from January 1, 2006, to December 31, 2014, including 88,214 cases of ischemic stroke and 30,433 cases of hemorrhagic stroke among residents of Shenzhen. Using a time-stratified case–crossover design with conditional quasi-Poisson regression, the study estimated the percentage changes in stroke morbidity associated with short-term exposure to nitrogen dioxide (NO₂), sulfur dioxide (SO₂), particulate matter less than 10 μm in aerodynamic diameter (PM10), carbon monoxide (CO), and ozone (O₃). A five-day moving average of air pollution was applied to capture the cumulative effects of air pollution. The estimates were further stratified by sex, age, education level, and season. Additive and multiplicative interactions between air pollutants and meteorological variables were assessed by the relative excess risk due to interaction (RERI) and by adding an interaction term to the main model, respectively. The study found that NO₂ was positively associated with ischemic stroke occurrence throughout the year and in the cold season (November through April), with a stronger effect observed among men. Each 10 μg/m³ increment in the five-day moving average of NO₂ was associated with a 2.38% (95% confidence interval: 1.36% to 3.41%) increase in the risk of ischemic stroke over the whole year and a 3.36% (2.04% to 4.69%) increase in the cold season.
The harmful effect of CO on ischemic stroke was observed only in the cold season, with each 1 mg/m³ increment in the five-day moving average of CO increasing the risk by 12.34% (3.85% to 21.51%). There was no statistically significant additive interaction between individual air pollutants and temperature or relative humidity, as measured by the RERI. The interaction term in the model, however, showed a multiplicative antagonistic effect between NO₂ and temperature (p-value = 0.0268). For hemorrhagic stroke, no evidence of an effect of any individual air pollutant was found in the whole population. However, the RERI indicated statistically significant additive and multiplicative interactions of temperature with the effects of PM10 and O₃ on hemorrhagic stroke onset, so this null finding should be interpreted with caution. The study suggests that environmental NO₂ and CO might increase the morbidity of ischemic stroke, particularly during the cold season. These findings could help inform policy decisions aimed at reducing air pollution levels to prevent stroke and other health conditions. Additionally, the study provides valuable insights into the interaction between air pollution and meteorological variables, which underscores the need for further research into the complex relationship between environmental factors and health.
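The two quantities the abstract reports can be sketched in a few lines. This is a minimal illustration of the standard formulas, not the authors' code; the numeric inputs below are illustrative placeholders, not the study's estimates.

```python
import math

def pct_change(beta, increment=10.0):
    """Percent change in morbidity per `increment` unit rise in a pollutant,
    given the log-linear (quasi-)Poisson coefficient `beta`."""
    return (math.exp(beta * increment) - 1.0) * 100.0

def reri(rr11, rr10, rr01):
    """Relative excess risk due to interaction (additive scale):
    RERI = RR11 - RR10 - RR01 + 1; zero means no additive interaction."""
    return rr11 - rr10 - rr01 + 1.0

# Illustrative values only: a coefficient of about 0.00235 per ug/m3
# corresponds to roughly a 2.38% rise in risk per 10 ug/m3.
print(round(pct_change(0.002352), 2))
print(round(reri(1.30, 1.12, 1.10), 2))
```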

Keywords: air pollution, meteorological variables, interactive effect, seasonal pattern, stroke

Procedia PDF Downloads 88
572 Topographic and Thermal Analysis of Plasma Polymer Coated Hybrid Fibers for Composite Applications

Authors: Hande Yavuz, Grégory Girard, Jinbo Bai

Abstract:

Manufacturing of hybrid composites requires particular attention to overcome various critical weaknesses that originate from poor interfacial compatibility. A large number of parameters have to be considered to optimize the interfacial bond strength, either to avoid flaw sensitivity or the delamination that occurs in composites. For this reason, surface characterization of the reinforcement phase is needed to provide the data required to assess fiber-matrix interfacial compatibility prior to fabrication of composite structures. Compared to conventional plasma polymerization processes such as radiofrequency and microwave, dielectric barrier discharge assisted plasma polymerization is a promising process that can be used to modify the surface properties of carbon fibers in a continuous manner. Finding the most suitable conditions (e.g., plasma power, plasma duration, precursor proportion) for plasma polymerization of pyrrole in the post-discharge region, either in the presence or absence of p-toluene sulfonic acid monohydrate, as well as the characterization of plasma polypyrrole coated fibers, are the central aspects of this work. Throughout the current investigation, atomic force microscopy (AFM) and thermogravimetric analysis (TGA) are used to characterize plasma-treated hybrid fibers (CNT-grafted Toray T700-12K carbon fibers, referred to as T700/CNT). TGA results show the trend in the decomposition of the polymer deposited on the fibers as a function of temperature up to 900 °C. All plasma pyrrole treated samples began to lose weight at a relatively fast rate up to 400 °C, which suggests the loss of polymeric structures. The weight loss between 300 and 600 °C is attributed to the evolution of CO2 due to the decomposition of functional groups (e.g., carboxyl compounds).
Keeping the surface chemical structure in mind, the higher the amount of carbonyl, alcohol, and ether compounds, the lower the stability of the deposited polymer. Thus, the highest weight loss is observed in the 1400 W, 45 s pyrrole+pTSA.H2O plasma treated sample, probably because it carries a less stable polymer than the other plasma treated samples. Comparison of the AFM images for untreated and plasma treated samples shows that the surface topography may change on a microscopic scale. The AFM image of the 1800 W, 45 s treated T700/CNT fiber shows the most significant increase in roughening compared to the untreated T700/CNT fiber: the fiber surface became roughly 3.6 times rougher than the untreated T700/CNT fiber. The increase in surface roughness may provide more contact points between fiber and matrix due to the increased surface area, which is believed to be beneficial for their application as reinforcement in composites.

Keywords: hybrid fibers, surface characterization, surface roughness, thermal stability

Procedia PDF Downloads 233
571 The Role of Non-Governmental Organizations in Promoting Humanitarian Development: A Case Study in Saudi Arabia

Authors: Muamar Salameh, Rania Sinno

Abstract:

Non-governmental organizations in Saudi Arabia play a vital role in promoting humanitarian development. Though this paper emphasizes this role and provides a specific case study on the role of the Prince Mohammad Bin Fahd Foundation for Humanitarian Development, many organizations do not provide transparent information about their accomplishments. This study answers the main research question regarding the role that NGOs play in promoting humanitarian development. The most recent law regulating associations and foundations in Saudi Arabia was issued in December 2015 and went into effect in March 2016; any new association or foundation must follow these regulations. Though the registration, implementation, and workflow of the organizations still need major improvement and development, the currently registered organizations have several notable achievements. Most of these organizations adopt a centralized administration approach, which in many cases hinders progress and may be an obstacle to reaching a larger population of beneficiaries. A large portion of the existing organizations are charities, some of which have some form of government affiliation. The laws and regulations limit the registration of new organizations: violations of Islamic Sharia, contradictions of public order, breaches of national unity, or foreign affiliation prohibit an organization from registering. The lack of transparency in the operations and inner workings of NGOs in Saudi Arabia is apparent to the public. However, the regulations require full transparency with the governing ministry. This transparency should be available to the public, and specifically to the target population eligible to benefit from the NGOs' services. In this study, we provide an extensive review of all laws, regulations, policies, and procedures related to NGOs in the Eastern Province of Saudi Arabia.
This review includes examples of current NGOs, their services, and their target populations. The study determines the main accomplishments of reputable NGOs that have positively impacted Saudi communities. The results highlight actions, services, and accomplishments that provide sustainable assistance in promoting humanitarian development and advancing the living conditions of target populations in the Saudi community. In particular, we concentrate on a case study of PMFHD, one of the largest foundations in the Eastern Province of Saudi Arabia. The authors have access to the data of this foundation and to its administration, allowing them to gather and analyze data and draw conclusions about this group. The study also analyzes whether the practices, budgets, services, and annual accomplishments of the foundation have fulfilled its humanitarian role while meeting governmental requirements, with an analysis in light of the new laws. The findings show that great accomplishments have been achieved in advancing and promoting humanitarian development in Saudi and international communities. Several examples are included from several NGOs, with specific examples from PMFHD.

Keywords: development, foundation, humanitarian, non-governmental organization, Saudi Arabia

Procedia PDF Downloads 296
570 Automatic Identification and Classification of Contaminated Biodegradable Plastics using Machine Learning Algorithms and Hyperspectral Imaging Technology

Authors: Nutcha Taneepanichskul, Helen C. Hailes, Mark Miodownik

Abstract:

Plastic waste has emerged as a critical global environmental challenge, primarily driven by the prevalent use of conventional plastics derived from petrochemical refining and manufacturing processes in modern packaging. While these plastics serve vital functions, their persistence in the environment post-disposal poses significant threats to ecosystems. Addressing this issue necessitates new approaches, one of which involves the development of biodegradable plastics designed to degrade under controlled conditions, such as industrial composting facilities. It is imperative to note that compostable plastics are engineered for degradation within specific environments and are not suited for uncontrolled settings, including natural landscapes and aquatic ecosystems. The full benefits of compostable packaging are realized when it is subjected to industrial composting, preventing environmental contamination and waste stream pollution. Therefore, effective sorting technologies are essential to enhance composting rates for these materials and diminish the risk of contaminating recycling streams. This study leverages hyperspectral imaging technology (HSI) coupled with advanced machine learning algorithms to accurately identify various types of plastics, encompassing conventional variants such as polyethylene terephthalate (PET), polypropylene (PP), low-density polyethylene (LDPE), and high-density polyethylene (HDPE), and biodegradable alternatives such as polybutylene adipate terephthalate (PBAT), polylactic acid (PLA), and polyhydroxyalkanoates (PHA). The dataset is partitioned into three subsets: a training dataset comprising uncontaminated conventional and biodegradable plastics, a validation dataset encompassing contaminated plastics of both types, and a testing dataset featuring real-world packaging items in both pristine and contaminated states.
Five distinct machine learning algorithms, namely Partial Least Squares Discriminant Analysis (PLS-DA), Support Vector Machine (SVM), Convolutional Neural Network (CNN), Logistic Regression, and Decision Tree, were developed and evaluated for their classification performance. Remarkably, the Logistic Regression and CNN models exhibited the most promising outcomes, each achieving a perfect accuracy rate of 100% on the training and validation datasets, while accuracy on the testing dataset exceeded 80%. The successful implementation of this sorting technology within recycling and composting facilities holds the potential to significantly raise recycling and composting rates. As a result, the envisioned circular economy for plastics can be established, offering a viable solution to mitigate plastic pollution.
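As a rough illustration of the classification step (not the authors' pipeline, data, or spectra), a logistic-regression classifier can be trained on synthetic two-class "pixel spectra" in a few lines of NumPy. The band positions, noise level, and class templates below are invented for the sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for hyperspectral pixel spectra over 50 bands:
# class 0 ("conventional") and class 1 ("biodegradable") each get a noisy
# Gaussian absorption template peaking at a different band.
bands = np.linspace(0.0, 1.0, 50)
template0 = np.exp(-((bands - 0.3) ** 2) / 0.01)
template1 = np.exp(-((bands - 0.7) ** 2) / 0.01)
X = np.vstack([template0 + 0.1 * rng.standard_normal((100, 50)),
               template1 + 0.1 * rng.standard_normal((100, 50))])
y = np.repeat([0.0, 1.0], 100)

# Logistic regression trained by plain gradient descent on the log-loss.
w, b = np.zeros(50), 0.0
for _ in range(500):
    p = 1.0 / (1.0 + np.exp(-(X @ w + b)))          # sigmoid probabilities
    grad_w = X.T @ (p - y) / len(y)                  # mean gradient wrt weights
    grad_b = np.mean(p - y)                          # mean gradient wrt bias
    w -= 0.5 * grad_w
    b -= 0.5 * grad_b

accuracy = np.mean(((X @ w + b) > 0) == (y == 1))
print(f"training accuracy: {accuracy:.2f}")
```

Because the two templates peak at distinct bands, the classes are linearly separable and the training accuracy approaches 1.0; real contaminated packaging spectra would of course be far harder, which is why the study's test accuracy drops to above 80% rather than 100%.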

Keywords: biodegradable plastics, sorting technology, hyperspectral imaging technology, machine learning algorithms

Procedia PDF Downloads 79
569 Prominence of Biopsychosocial Formulation in Health Care Delivery for Aging Population: Empowering Caregiving through Natural Socio-Environmental Approaches

Authors: Kristine Demilou D. Santiago

Abstract:

Access to a high-quality health care system is what sets industrialized nations such as the United States apart from developing countries, in this case specifically with respect to their older populations. But what is the underrated factor in the sphere of quality healthcare rendered to elderly people in the Western context? Could this factor help close the existing gaps between patients in denial and the medications they live with day to day? Are natural socio-environmental approaches to caregiving the lasting remedy to healthcare disparities for the aging population in their day-to-day living? The conceptual framework of this model addresses the health and illness of human beings by considering the biological, psychological, and socio-environmental factors around them. Advancing each characteristic of the biopsychosocial (BPS) model in balanced consideration is the aim of this study, in an attempt to respond to prevailing disparities in day-to-day caregiving services for old-aged patients. Caregiving services have been the medium connecting the patient and their prescribed medications. Moreover, caregivers serve as positive reinforcers in a patient's environment; they therefore play an important role in healthcare delivery, and are significant people whose actions affect a patient's view of life. This research study presents the supreme importance of biopsychosocial assessment for old-aged patients with mental health illnesses and conditions. Biopsychosocial assessment can secure fully effective medication for an old-aged adult suffering from a mental illness, because it offers a recognizably wholesome approach to the medical healing of old-aged adult patients.
The principle of the biopsychosocial model goes beyond the biomedicine offered to old-aged adults with mental illness, but it does not take away the high relevance of scientific biomedicine in healing patients. The framework presents an overlapping participation of each of its factors, which together affect a person's health in general. According to BPS, the biological (physiological), psychological (mental), and social (environmental) dimensions of a person's health condition require equal attention and always coexist with one another. Indisputably, biomedicine has been, and remains, in an unceasing endeavor to provide scientifically proven health care for every individual seeking medical treatment. As we grow older and eventually reach the other side of the median population, not only do our physiological aspects change; psychological and socio-environmental changes happen too. Caregiving is a salient responsibility that takes place amid these inevitable changes.

Keywords: biopsychosocial formulation, caregiving through natural approaches, US health care, BPS in caregiving, caregiving for aging population

Procedia PDF Downloads 98
568 Particle Observation in Secondary School Using a Student-Built Instrument: Design-Based Research on a STEM Sequence about Particle Physics

Authors: J. Pozuelo-Muñoz, E. Cascarosa-Salillas, C. Rodríguez-Casals, A. de Echave, E. Terrado-Sieso

Abstract:

This study focuses on the development, implementation, and evaluation of an instructional sequence aimed at 16–17-year-old students, involving the design and use of a cloud chamber—a device that allows observation of subatomic particles. The research addresses the limited presence of particle physics in Spanish secondary and high school curricula, a gap that restricts students' learning of advanced physics concepts and diminishes engagement with complex scientific topics. The primary goal of this project is to introduce particle physics in the classroom through a practical, interdisciplinary methodology that promotes autonomous learning and critical thinking. The methodology is framed within Design-Based Research (DBR), an approach that enables iterative and pragmatic development of educational resources. The research proceeded in several phases, beginning with the design of an experimental teaching sequence, followed by its implementation in high school classrooms. This sequence was evaluated, redesigned, and reimplemented with the aim of enhancing students’ understanding and skills related to designing and using particle detection instruments. The instructional sequence was divided into four stages: introduction to the activity, research and design of cloud chamber prototypes, observation of particle tracks, and analysis of collected data. In the initial stage, students were introduced to the fundamentals of the activity and provided with bibliographic resources to conduct autonomous research on cloud chamber functioning principles. During the design stage, students sourced materials and constructed their own prototypes, stimulating creativity and understanding of physics concepts like thermodynamics and material properties. The third stage focused on observing subatomic particles, where students recorded and analyzed the tracks generated in their chambers. 
Finally, critical reflection was encouraged regarding the instrument's operation and the nature of the particles observed. The results show that designing the cloud chamber motivates students and actively engages them in the learning process. Additionally, the use of this device introduces advanced scientific topics beyond particle physics, promoting a broader understanding of science. The study’s conclusions emphasize the need to provide students with ample time and space to thoroughly understand the role of materials and physical conditions in the functioning of their prototypes and to encourage critical analysis of the obtained data. This project not only highlights the importance of interdisciplinarity in science education but also provides a practical framework for teachers to adapt complex concepts for educational contexts where these topics are often absent.

Keywords: cloud chamber, particle physics, secondary education, instructional design, design-based research, STEM

Procedia PDF Downloads 13
567 Bacterial Diversity in Vaginal Microbiota in Patients with Different Levels of Cervical Lesions Related to Human Papillomavirus Infection

Authors: Michelle S. Pereira, Analice C. Azevedo, Julliane D. Medeiros, Ana Claudia S. Martins, Didier S. Castellano-Filho, Claudio G. Diniz, Vania L. Silva

Abstract:

Vaginal microbiota is a complex ecosystem, composed of aerobic and anaerobic bacteria living in a dynamic equilibrium. Lactobacillus spp. are predominant in the vaginal ecosystem, and factors such as immunity and hormonal variations may lead to disruptions, resulting in the proliferation of opportunistic pathogens. Bacterial vaginosis (BV) is a polymicrobial syndrome caused by an increase in anaerobic bacteria replacing Lactobacillus spp. Microorganisms such as Gardnerella vaginalis, Mycoplasma hominis, Mobiluncus spp., and Atopobium vaginae can be found in BV, which may also be associated with other infections, such as Human Papillomavirus (HPV). HPV is highly prevalent in sexually active women and is considered a risk factor for the development of cervical cancer. As few data are available on the vaginal microbiota of women with HPV-associated cervical lesions, our objective was to evaluate the diversity of the vaginal ecosystem in these women. For all patients, clinical and socio-demographic data were collected after gynecological examination. This study was approved by the Ethics Committee of the Federal University of Juiz de Fora, Minas Gerais, Brazil. Vaginal secretion and cervical scrapings were collected. Gram-stained smears were evaluated to establish the Nugent score for BV determination. The viral and bacterial DNA obtained was used as template for HPV genotyping (PCR) and bacterial fingerprinting (REP-PCR). In total, 31 patients were included (mean age 35; 93.6% sexually active). The Nugent score showed that 38.7% had BV. Pap smear tests from the medical records showed that 32.3% had low-grade squamous intraepithelial lesion (LSIL), 29% had high-grade squamous intraepithelial lesion (HSIL), 25.8% had atypical squamous cells of undetermined significance (ASC-US), and 12.9% had atypical squamous cells that cannot exclude a high-grade lesion (ASC-H). All participants were HPV+. HPV-16 was the most frequent type (87.1%), followed by HPV-18 (61.3%).
HPV-31, HPV-52, and HPV-58 were also detected. HPV-16/HPV-18 coinfection was observed in 75%. In the 18-30 age group, HPV-16 was detected in 40%, and HPV-16/HPV-18 coinfection in 35%. HPV-16 was associated with 30% of ASC-H and 20% of HSIL patients. BV was observed in 50% of HPV-16+ participants and in 45% of those with HPV-16/HPV-18 coinfection. Fingerprints of the bacterial communities showed clusters with low similarity, suggesting high heterogeneity in the vaginal microbiota within the sampled group. Overall, the data are worrisome, as HPV types highly associated with cervical cancer risk were identified. The high microbial diversity observed may be related to the different levels of cellular lesions and the different physiological conditions of the participants (age, social behavior, education). Further prospective studies are needed to better address the correlations between BV, microbial imbalance in the vaginal ecosystem, and the different cellular lesions in women with HPV infection. Supported by FAPEMIG, CNPq, CAPES, PPGCBIO/UFJF.

Keywords: human papillomavirus, bacterial vaginosis, bacterial diversity, cervical cancer

Procedia PDF Downloads 195
566 Customer Focus in Digital Economy: Case of Russian Companies

Authors: Maria Evnevich

Abstract:

In modern conditions, in most markets, price competition is becoming less effective. On the one hand, there is a gradual decrease in the level of marginality in the main traditional sectors of the economy, so further price reduction becomes too 'expensive' for the company. On the other hand, the effect of price reduction is leveled off, and the reason for this phenomenon is likely informational. As a result, even if the company reduces prices, making its products more accessible to the buyer, there is a high probability that this will not increase sales unless additional large-scale advertising and information campaigns are conducted. Indeed, a large-scale information and advertising campaign has a much greater effect by itself than price reductions do. At the same time, the cost of mass informing grows every year, especially in the main information channels. The article presents a generalization, systematization, and development of theoretical approaches and best practices in customer-focused business management and in relationship marketing in the modern digital economy. The research methodology is based on the synthesis and content analysis of sociological and marketing research and on a study of the systems for handling consumer appeals and the loyalty programs of the 50 largest client-oriented companies in Russia. An analysis of internal documentation on customers' purchases in one of the largest retail companies in Russia also made it possible to identify whether buyers prefer to make complex purchases in the one retail store with the best price image for them. The cost of attracting a new client is now quite high and continues to grow, so it becomes more important to retain clients and increase their involvement through marketing tools. A huge role is played by modern digital technologies, used both in advertising (e-mailing, SEO, contextual advertising, banner advertising, SMM, etc.) and in service.
To implement the client-oriented omnichannel service described above, it is necessary to identify the client and work with the personal data provided when filling in the loyalty program application form. The analysis of the loyalty programs of 50 companies identified the following types of cards: discount cards, bonus cards, mixed cards, coalition loyalty cards, bank loyalty programs, aviation loyalty programs, hybrid loyalty cards, and situational loyalty cards. The use of loyalty cards allows a company not only to stimulate 'untargeted' purchases, but also to provide individualized offers and more targeted information. The development of digital technologies and modern means of communication has significantly changed not only marketing and promotion, but the economic landscape as a whole. The factors of competitiveness are now the digital capabilities of companies in the field of customer orientation: personalization of service, customization of advertising offers, optimization of marketing activity, and improvement of logistics.

Keywords: customer focus, digital economy, loyalty program, relationship marketing

Procedia PDF Downloads 163
565 Temporal Estimation of Hydrodynamic Parameter Variability in Constructed Wetlands

Authors: Mohammad Moezzibadi, Isabelle Charpentier, Adrien Wanko, Robert Mosé

Abstract:

The calibration of hydrodynamic parameters for subsurface constructed wetlands (CWs) is a sensitive process, since highly non-linear equations are involved in unsaturated flow modeling. CW systems are engineered systems designed to favour natural treatment processes involving wetland vegetation, soil, and their microbial flora. Their significant efficiency at reducing the ecological impact of urban runoff has recently been proved in the field. Numerical flow modeling in a vertical, variably saturated CW is carried out here by implementing the Richards model by means of a mixed hybrid finite element method (MHFEM), particularly well adapted to the simulation of heterogeneous media, together with the van Genuchten-Mualem parametrization. For validation purposes, MHFEM results were compared to those of HYDRUS (software based on a finite element discretization). As the van Genuchten-Mualem soil hydrodynamic parameters depend on water content, their estimation is the subject of considerable experimental and numerical study. In particular, the sensitivity analysis performed with respect to the van Genuchten-Mualem parameters reveals a predominant influence of the shape parameters α and n and of the saturated conductivity of the filter on the piezometric heads, during both saturation and desaturation. Modeling issues arise when the soil reaches oven-dry conditions. Particular attention should also be paid to boundary condition modeling (surface ponding or evaporation) in order to tackle different sequences of rainfall-runoff events. For proper parameter identification, large field datasets would be needed. As these are usually not available, notably due to the randomness of storm events, we propose a simple, robust, and low-cost numerical method for the inverse modeling of the soil hydrodynamic properties. Among available methods, the variational data assimilation technique introduced by Le Dimet and Talagrand is applied.
To that end, the variational data assimilation technique is implemented by applying automatic differentiation (AD) to augment the computer codes with derivative computations. Note that very little effort is needed to obtain the differentiated code using the online Tapenade AD engine. Field data were collected over several months for a three-layered CW located in Strasbourg (Alsace, France) at the water's edge of the urban stream Ostwaldergraben. Identification experiments are conducted by comparing measured and computed piezometric heads by means of a least-squares objective function. The temporal variability of the hydrodynamic parameters is then assessed and analyzed.
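The van Genuchten-Mualem parametrization mentioned above can be sketched in a few lines. The parameter values in the example are generic textbook figures for a sandy medium, not those calibrated for the Ostwaldergraben site.

```python
import numpy as np

def van_genuchten(h, theta_r, theta_s, alpha, n):
    """Water content theta(h) from the van Genuchten retention curve.
    h: pressure head (negative in the unsaturated zone); alpha and n are
    the shape parameters flagged as dominant by the sensitivity analysis."""
    m = 1.0 - 1.0 / n
    h = np.asarray(h, dtype=float)
    Se = np.where(h < 0, (1.0 + np.abs(alpha * h) ** n) ** (-m), 1.0)
    return theta_r + (theta_s - theta_r) * Se

def mualem_K(h, Ks, alpha, n):
    """Unsaturated hydraulic conductivity K(h) from Mualem's model,
    reducing to the saturated conductivity Ks at h >= 0."""
    m = 1.0 - 1.0 / n
    h = np.asarray(h, dtype=float)
    Se = np.where(h < 0, (1.0 + np.abs(alpha * h) ** n) ** (-m), 1.0)
    return Ks * np.sqrt(Se) * (1.0 - (1.0 - Se ** (1.0 / m)) ** m) ** 2

# Illustrative sandy-soil parameters (not the calibrated filter values):
theta = van_genuchten(-1.0, theta_r=0.045, theta_s=0.43, alpha=14.5, n=2.68)
```

In an inverse-modeling loop, the AD-generated derivatives of the simulated piezometric heads with respect to theta_r, theta_s, alpha, n, and Ks drive the minimization of the least-squares misfit.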

Keywords: automatic differentiation, constructed wetland, inverse method, mixed hybrid FEM, sensitivity analysis

Procedia PDF Downloads 163
564 A Greener Approach towards the Synthesis of an Antimalarial Drug Lumefantrine

Authors: Luphumlo Ncanywa, Paul Watts

Abstract:

Malaria is a disease that kills approximately one million people annually; children and pregnant women in sub-Saharan Africa are among its most frequent victims. Malaria continues to be one of the major causes of death, especially in poor countries in Africa, so decreasing its burden and saving lives is essential. There is major concern about malaria parasites developing resistance towards antimalarial drugs, and people are still dying due to a lack of medicine affordability in less well-off countries of the world. If more people could receive treatment by reducing the cost of drugs, the number of deaths in Africa could be massively reduced. There is a shortage of pharmaceutical manufacturing capability within many of the countries in Africa, and one has to question how Africa would actually manufacture the drugs, active pharmaceutical ingredients, or medicines developed within these research programs. It is quite likely that such manufacturing would be outsourced overseas, hence increasing the cost of production and potentially limiting the full benefit of the original research. As a result, the last few years have seen major interest in developing more effective and cheaper technology for manufacturing generic pharmaceutical products. Micro-reactor technology (MRT) is an emerging technique that enables those working in research and development to rapidly screen reactions under continuous flow, leading to the identification of reaction conditions that are suitable for use at the production level. This emerging technique will be used to develop antimalarial drugs. It is this system flexibility that has the potential to reduce both the time taken and the risk associated with transferring reaction methodology from research to production.
Using an approach referred to as scale-out or numbering up, a reaction is first optimized within the laboratory using a single micro-reactor; to increase production volume, the number of reactors employed is simply increased. The overall aim of this research project is to develop and optimize the synthesis of antimalarial drugs as a continuous process. This will provide a step change in pharmaceutical manufacturing technology that will increase the availability and affordability of antimalarial drugs on a worldwide scale, with a particular emphasis on Africa in the first instance. The research will determine the best chemistry and technology to define the lowest-cost manufacturing route to pharmaceutical products. We are currently developing a method to synthesize lumefantrine in continuous flow, using the batch process as a benchmark. Lumefantrine is a dichlorobenzylidene derivative effective for the treatment of various types of malaria; it is used with artemether for the treatment of uncomplicated malaria. The results obtained when synthesizing lumefantrine in a batch process are transferred into a continuous flow process in order to develop a better and more reproducible process. The development of an appropriate synthetic route for lumefantrine is therefore significant for the pharmaceutical industry. Consequently, if better (and cheaper) manufacturing routes to antimalarial drugs can be developed and implemented where needed, antimalarial drugs are far more likely to be available to those in need.
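The numbering-up approach scales linearly by construction, since each added reactor runs the same optimized conditions rather than a re-optimized larger vessel. A toy sketch (the throughput figures are hypothetical, not measured lumefantrine yields):

```python
def numbered_up_throughput(rate_g_per_h, n_reactors, uptime_h):
    """Total output of a scaled-out ("numbered-up") micro-reactor bank:
    linear in the number of identical reactors and the operating time."""
    return rate_g_per_h * n_reactors * uptime_h

# Hypothetical example: one reactor producing 5 g/h, numbered up to
# 10 reactors and run for 24 h.
print(numbered_up_throughput(5.0, 10, 24))  # 1200.0 g
```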

Keywords: antimalarial, flow, lumefantrine, synthesis

Procedia PDF Downloads 202
563 Nonviolent Communication and Disciplinary Area of Social Communication: Case Study on the International Circulation of Ideas from a Brazilian Perspective

Authors: Luiza Toschi

Abstract:

This work presents part of an empirical and theoretical master's meta-research project that is interested in the relationship between the disciplinary area of Social Communication, investigated through the characteristics of the Bourdieusian scientific field, and the emergence of public interest in Nonviolent Communication (NVC) in Brazil and the world. To this end, the state of the art of this conceptual and practical relationship is investigated based on scientific productions available in spaces of academic credibility, such as conferences and scientific journals renowned in the field. From there, the agents and the sociological factors that lead them to contribute, or not, to scientific production in Brazil and the world are mapped. In this work, a brief dive into the international context is presented to understand if and how nonviolent communication permeates scientific production in communication in a systematic way. Using three accessible articles published between 2013 and 2022 in the 117 journals classified as quartile Q1 in the Journal Ranking of Communication, the international production on the subject is compared with the Brazilian one in its context. The social conditions of the international circulation of ideas are thus discussed. Science is a product of its social environment, arising from relations of interest and power that compete in the political dimension as much as in the epistemological dimension. In this way, scientific choices are linked to the resources mobilized from or through the prestige and recognition of peers. In this sense, an object stands out to a scientist for its academic value, but also, and inseparably, for its social interest within the collective, its social stratification, and the context of legitimacy created in its surroundings, influenced by cultural universalism. In Brazil, three published articles were found in congresses and journals that mention NVC in their abstract or keywords.
All were written by Public Relations undergraduate students. Among the more experienced researchers who supervised or validated the publications, two professionals are interested in the Culture of Peace and dialogy. Likewise, internationally, only three of the articles found mention the term in their abstract or title. Two analyze journalistic coverage based on the principles of NVC and Journalism for Peace. The third is by one of the Brazilian researchers identified as interested in dialogic practices, who analyzes audiovisual material and promotes epistemological reflections. On the one hand, some characteristics inside and outside Brazil are similar: small samples, a relationship with peace studies, and female researchers (two of whom are Brazilian); on the other hand, differences are obvious. Within the country the subject is mostly framed as Organizational Communication, whereas outside Brazil this intersection is not presented explicitly. Furthermore, internationally there is an interest in analyzing from the perspective of NVC, which has not so far been found in publications in Brazil. Up to the present moment, it is possible to presume that, universally, the legitimacy of the topic is sought through its association with conflict conciliation research and communication for peace.

Keywords: academic field sociology, international circulation of ideas, meta research in communication, nonviolent communication

Procedia PDF Downloads 39
562 Treatment and Diagnostic Imaging Methods of Fetal Heart Function in Radiology

Authors: Mahdi Farajzadeh Ajirlou

Abstract:

Prior evidence of normal cardiac anatomy is desirable to relieve the anxiety of patients with a family history of congenital heart disease, or to offer the option of early termination of pregnancy or close follow-up should a cardiac anomaly be proved. Fetal heart assessment plays an important part in the evaluation of the fetus, and it can reflect fetal heart function, which is regulated by the central nervous system. Acquisition of ventricular volume and inflow data would be useful to quantify valve regurgitation and ventricular function, in order to determine the degree of cardiovascular compromise in fetal conditions at risk of hydrops fetalis. This study discusses imaging the fetal heart with transvaginal ultrasound, Doppler ultrasound, three-dimensional ultrasound (3DUS) and four-dimensional (4D) ultrasound, spatiotemporal image correlation (STIC), magnetic resonance imaging, and cardiac catheterization. Doppler ultrasound (DUS) imaging is a kind of real-time imaging that depicts blood vessels and soft tissues well. DUS imaging can show the shape of the fetus, but it cannot show whether the fetus is hypoxic or distressed. Spatiotemporal image correlation (STIC) enables the acquisition of a volume of data concomitant with the beating heart. The automated volume acquisition is made possible by the array in the transducer performing a slow single sweep, recording a single 3D data set consisting of numerous 2D frames one behind the other. The volume acquisition can be done as static 3D, as online 4D (direct volume scan, live 3D ultrasound, so-called 4D (3D/4D)), or as spatiotemporal image correlation (STIC; off-line 4D, a circular volume sweep). Fetal cardiovascular MRI would appear to be an ideal approach to the noninvasive investigation of the impact of abnormal cardiovascular hemodynamics on antenatal brain growth and development.
Still, there are practical limitations to the use of conventional MRI for fetal cardiovascular assessment, including the small size and high heart rate of the human fetus, the lack of conventional cardiac gating methods to synchronize data acquisition, and the potential corruption of MRI data due to maternal respiration and unpredictable fetal movements. Fetal cardiac MRI has the potential to complement ultrasound in detecting cardiovascular malformations and extracardiac lesions. Fetal cardiac intervention (FCI), a minimally invasive catheter intervention, is a new and evolving technique that allows in-utero treatment of a subset of severe forms of congenital heart disease. In special cases, it may be possible to modify the natural history of congenital heart disorders. It is entirely possible that future generations will 'repair' congenital heart disease in utero using nanotechnologies or remote computer-guided micro-robots working at the cellular level.

Keywords: fetal, cardiac MRI, ultrasound, 3D, 4D, heart disease, invasive, noninvasive, catheter

Procedia PDF Downloads 38
561 Effects of Culture Conditions on the Adhesion of Yeast Candida spp. and Pichia spp. to Stainless Steel with Different Polishing and Their Control

Authors: Ružica Tomičić, Zorica Tomičić, Peter Raspor

Abstract:

An abundant growth of unwanted yeasts in food processing plants can lead to problems in quality and safety, with significant financial losses. Candida and Pichia are the genera mainly involved in the spoilage of products in the food and beverage industry. These contaminating microorganisms can form biofilms on food contact surfaces, which are difficult to eradicate, increasing the probability of microbial survival and further dissemination during food processing. It is well known that biofilms are more resistant to antimicrobial agents than planktonic cells, and this makes them difficult to eliminate. Among the strategies used to overcome resistance to antifungal drugs and preservatives, the use of natural substances such as plant extracts has shown particular promise, and many natural substances have been found to exhibit antifungal properties. This study aimed to investigate the impact of growth medium (Malt Extract Broth (MEB) or Yeast Peptone Dextrose (YPD) broth) and temperature (7°C, 37°C and 43°C for Candida strains; 7°C, 27°C and 32°C for Pichia strains) on the adhesion of Candida spp. and Pichia spp. to stainless steel (AISI 304) discs with different degrees of surface roughness (Ra = 25.20 – 961.9 nm), a material commonly used in the food industry. We also evaluated the antifungal and anti-adhesion activity of plant extracts of Humulus lupulus, Alpinia katsumadai and Evodia rutaecarpa against C. albicans, C. glabrata and P. membranifaciens, and investigated whether these plant extracts can interfere with biofilm formation. Adhesion was assessed by the crystal violet staining method, while the broth microdilution method CLSI M27-A3 was used to determine the minimum inhibitory concentration (MIC) of the plant extracts. Our results indicated that the nutrient content of the medium significantly influenced the number of adhered cells of the tested yeasts. The growth medium which resulted in higher adhesion of C. albicans and C. glabrata was MEB, while for C.
parapsilosis and C. krusei it was YPD. In the case of P. pijperi and P. membranifaciens, YPD broth was more effective in promoting adhesion than MEB. Regarding the effect of temperature, the C. albicans strain adhered to stainless steel surfaces at a significantly higher level at 43°C, while C. glabrata, C. parapsilosis and C. krusei behaved differently, with significantly higher adhesion at 37°C than at 7°C and 43°C. Further, the adherence of Pichia strains was highest at 27°C. Based on the MIC values, all plant extracts exerted significant antifungal effects, with MIC values ranging from 100 to 400 μg/mL. It was observed that biofilms of C. glabrata were more resistant to the plant extracts than those of C. albicans. However, extracts of A. katsumadai and E. rutaecarpa promoted the growth and development of the preformed biofilm of P. membranifaciens. Thus, knowledge of how these microorganisms adhere and which factors affect this phenomenon is of great importance in order to avoid their colonization of food contact surfaces.

Keywords: adhesion, Candida spp., Pichia spp., plant extracts

Procedia PDF Downloads 194
560 Consensus, Federalism and Inter-State Water Disputes in India

Authors: Amrisha Pandey

Abstract:

The Indian constitution has distributed the powers to govern and legislate between the centre and the state governments based on the lists of subject matter provided in the Seventh Schedule. Under that schedule, the states are authorized to regulate water resources within their territory, while the centre/union government is authorized to regulate inter-state water disputes. The powers entrusted to the union government mainly deal with the sharing of river water which flows through the territory of two or more states. For that purpose, Article 262 of the Constitution of India empowers parliament to resolve any such inter-state river water dispute. Parliament has accordingly enacted the Inter-State River Water Disputes Act, which allows the central/union government to constitute a tribunal for the adjudication of such disputes and expressly bars the jurisdiction of the judiciary in the matter. This arrangement was intended to resolve disputes using political or diplomatic means, without deliberately interfering with the sovereign power of the states to govern their water resources. The situation in the present context is complicated and sensitive, owing to changing climatic conditions, increasing demand for the limited resource, and an advanced understanding of the freshwater cycle that is missing from the existing legal regime. The obsolete legal and political tools, the existing legislative mechanism, and the institutional units do not seem to accommodate the rising challenge of regulating the resource, resulting in the growing politicization of inter-state water disputes. Against this background, this paper investigates inter-state river water disputes in India and critically analyzes the ability of the existing constitutional and institutional units to perform this task.
Moreover, the competence of the tribunal as the adjudicating body in the present context is analyzed using a long-running inter-state water dispute in India, the Cauvery water dispute, as the case study. To conduct this task, the doctrinal methodology of research is adopted. The disputes are also investigated through the lens of sovereignty, which is accorded to the states through the theory of 'separation of powers' and the 'grant of internal sovereignty' to the federal units of governance. The issue of sovereignty is discussed in two ways: 1) as the responsibility of the state to govern the resource; and 2) as the obligation of the state to govern the resource, arising from the sovereign power of the state. Furthermore, a duality of sovereign power coexists in this analysis: the overall sovereign authority of the nation-state, and the internal sovereignty of the states as its federal units of governance. As a result, this investigation proposes institutional, legislative and judicial reforms. Additionally, it suggests certain amendments to the existing constitutional provisions in order to avoid contradictions in their scope and meaning in the light of advanced hydrological understanding.

Keywords: constitution of India, federalism, inter-state river water dispute tribunal of India, sovereignty

Procedia PDF Downloads 153
559 Nondestructive Monitoring of Atomic Reactions to Detect Precursors of Structural Failure

Authors: Volodymyr Rombakh

Abstract:

This article substantiates the possibility of detecting the precursors of catastrophic destruction of a structure or device and stopping operation before it occurs. Damage to solids results from breaking the bonds between atoms, which requires energy. Modern theories of strength and fracture assume that such energy is due to stress. However, in a letter to W. Thomson (Lord Kelvin) dated December 18, 1856, J.C. Maxwell provided evidence that elastic energy cannot destroy solids. He proposed an equation for estimating a deformable body's energy as the sum of two energies: the first term, due to symmetrical compression, does not change, while the second term represents distortion without compression. Both types of energy are represented in the equation as quadratic functions of strain; Maxwell repeatedly wrote that it is strain, not stress. Furthermore, he noted that the nature of the energy causing the distortion was unknown to him. His article devoted to theories of elasticity was published in 1850. Maxwell tried to express mechanical properties with the help of optics, which became possible only after the creation of quantum mechanics. However, Maxwell's work on elasticity is not cited in the theories of strength and fracture, and the authors of these theories and their associates are still trying to describe the phenomena they observe on the basis of classical mechanics. The study of Faraday's experiments and of Maxwell's and Rutherford's ideas made it possible to discover a previously unknown region of electromagnetic radiation. The properties of photons emitted in this reaction are fundamentally different from those of photons emitted in nuclear reactions and are caused by the transition of electrons in an atom. Such photons are released during all processes in the universe, including from plants and organs under natural conditions; their penetrating power in metal is millions of times greater than that of gamma rays, yet they are harmless.
This apparent contradiction arises because the chaotic motion of protons is accompanied by chaotic radiation of photons in time and space; such photons are not coherent, and the energy of a solitary photon is insufficient to break the bond between atoms, one of the stages of which is ionization. Photographs registered the rail deformation caused by 113 cars, while a Geiger counter did not. The author's studies show that the cause of damage to a solid is the breakage of bonds between a finite number of atoms due to the stimulated emission of metastable atoms. The guarantee of a structure's reliability is the ratio of the energy dissipation rate to the energy accumulation rate, not strength, which is not a physical parameter since it can be neither measured nor calculated. Continuous monitoring of this ratio is possible thanks to the spontaneous emission of photons by metastable atoms. The article presents example calculations of destruction energy, together with photographs attributed to the action of photons emitted during the atomic-proton reaction.

Keywords: atomic-proton reaction, precursors of man-made disasters, strain, stress

Procedia PDF Downloads 92
558 European Commission Radioactivity Environmental Monitoring Database REMdb: A Law (Art. 36 Euratom Treaty) Transformed in Environmental Science Opportunities

Authors: M. Marín-Ferrer, M. A. Hernández, T. Tollefsen, S. Vanzo, E. Nweke, P. V. Tognoli, M. De Cort

Abstract:

Under the terms of Article 36 of the Euratom Treaty, European Union Member States (MSs) shall periodically communicate to the European Commission (EC) information on environmental radioactivity levels. Compilations of the information received have been published by the EC as a series of reports beginning in the early 1960s. The environmental radioactivity results received from the MSs have been introduced into the Radioactivity Environmental Monitoring database (REMdb) of the Institute for Transuranium Elements of the EC Joint Research Centre (JRC), located in Ispra, Italy, as part of its Directorate General for Energy (DG ENER) support programme. The REMdb offers the scientific community working on environmental radioactivity countless research opportunities to exploit the nearly 200 million records received from MSs, containing information on radioactivity levels in milk, water, air and mixed diet. The REM action was created shortly after the Chernobyl crisis to support the EC in its responsibility for providing qualified information to the European Parliament and the MSs on the levels of radioactive contamination of the various compartments of the environment (air, water, soil). Hence, the main line of REM's activities concerns the improvement of procedures for the collection of environmental radioactivity concentrations under routine and emergency conditions, as well as making this information available to the general public. In this way, REM ensures the availability of tools for inter-communication and for users from the Member States and other European countries to access this information. Specific attention is given to further integrating the new MSs into the existing information-exchange systems and to assisting Candidate Countries in fulfilling these obligations in view of their future EU membership.
Article 36 of the Euratom Treaty requires the competent authorities of each MS to regularly provide the environmental radioactivity monitoring data resulting from their Article 35 obligations to the EC, in order to keep the EC informed of the levels of radioactivity in the environment (air, water, milk and mixed diet) which could affect the population. The REMdb has two main objectives: to keep a historical record of radiological accidents for further scientific study, and to collect the environmental radioactivity data gathered through the national environmental monitoring programs of the MSs in order to prepare the comprehensive annual monitoring reports (MR). The JRC continues its activity of collecting, assembling, analyzing and providing this information to the public and the MSs, even during emergency situations. In addition, there is growing concern among the general public about radioactivity levels in the terrestrial and marine environment, as well as about the potential risk of future nuclear accidents. In this context, clear and transparent communication with the public is needed. EURDEP (European Radiological Data Exchange Platform) is both a standard format for radiological data and a network for the exchange of automatic monitoring data. The latest release of the format is version 2.0, which has been in use since the beginning of 2002.

Keywords: environmental radioactivity, Euratom, monitoring report, REMdb

Procedia PDF Downloads 443
557 Exposure of Pacu, Piaractus mesopotamicus Gill Tissue to a High Stocking Density: An Ion Regulatory and Microscopy Study

Authors: Wiolene Montanari Nordi, Debora Botequio Moretti, Mariana Caroline Pontin, Jessica Pampolini, Raul Machado-Neto

Abstract:

Gills are the organs responsible for respiration and osmoregulation between the fish's internal environment and the water. Under stress conditions, the oxidative response and the gill plasticity that attempts to increase the gas-exchange area are noteworthy, compromising physiological processes and therefore fish health. Colostrum is a dietary source of nutrients, immunoglobulins, antioxidants and bioactive molecules, essential for immunological protection and the development of the gastrointestinal epithelium. The hypothesis of this work is that antioxidant factors present in colostrum, tested in gills for the first time here, can minimize or reduce the alteration of the epithelial structure of juvenile pacu (Piaractus mesopotamicus) subjected to high stocking density. The histological changes in gill architecture were characterized by the frequency, incidence and severity of tissue alteration and by ionic status. Juveniles (50 kg fish/m³) were fed pelleted diets containing 0, 10, 20 or 30% lyophilized bovine colostrum (LBC), and at 30 experimental days, gill and blood samples were collected from eight fish per treatment. The study revealed differences in the type, frequency and severity (histological alterations index, HAI) of tissue alterations among the treatments; however, no distinct differences in the incidence of alteration (mean alteration value, MAV) were observed. The main histological changes in the gills were elevation of the lamellar epithelium; excessive cell proliferation of the filament and lamellar epithelium, causing total or partial fusion of the lamellae; hyperplasia and hypertrophy of the lamellar and filament epithelium; uncontrolled thickening of filament and lamellar tissues; the presence of mucous and chloride cells in the lamellae; aneurysms; vascular congestion; and the presence of parasites.
The MAVs obtained per treatment were 2.0, 2.5, 1.8 and 2.5 for fish fed diets containing 0, 10, 20 and 30% LBC, respectively, classifying the incidence of gill alterations as slight to moderate. The severity of alteration in individual fish of treatments 0, 10 and 20% LBC ranged from 5 to 40 (HAI means of 20.1, 17.5 and 17.6, respectively, P > 0.05), differing from 30% LBC, which ranged from 6 to 129 (HAI mean of 77.2, P < 0.05). The HAI values in treatments 0, 10 and 20% LBC reveal gill tissue with injuries classified as slight to moderate, while in 30% LBC they were moderate to severe, a consequence of the onset of necrosis in the tissue of two fish, which compromises the normal functioning of the organ. Regarding the frequency of gill alterations, evaluated on a scale from absence of alterations (0) to highly frequent (+++), histological alterations were observed in all evaluated fish, with a trend toward higher frequency at 0% LBC. The concentrations of Na⁺, Cl⁻, K⁺ and Ca²⁺ did not change in any treatment (P > 0.05), indicating a similar ion-exchange capacity. The concentrations of bovine colostrum used in the diets of the present study did not prevent the alterations observed in the gills of juvenile pacu.

Keywords: histological alterations of gill tissue, ionic status, lyophilized bovine colostrum, optical microscopy

Procedia PDF Downloads 299
556 The Influence of Gender and Sexual Orientation on Police Decisions in Intimate Partner Violence Cases

Authors: Brenda Russell

Abstract:

Police officers spend a great deal of time responding to domestic violence calls. Recent research has found that men and women in heterosexual and same-sex relationships are equally likely to initiate intimate partner violence (IPV) and are similarly susceptible to victimization, yet police training tends to focus primarily on male perpetration and female victimization. Criminal justice studies have found that male perpetrators of IPV are blamed more than female perpetrators who commit the same offense. While previous research has examined officers' responses in IPV cases with male and female heterosexual offenders, research has yet to investigate police response in same-sex relationships. This study examined officers' decisions to arrest and their perceptions of blame, perceived danger to others, disrespect, and beliefs about prosecution, guilt and sentencing. Officers in the U.S. (N = 248) were recruited using word of mouth and access to police association websites, where a link to an online study was made available. Officers were provided with one of four experimentally manipulated scenarios depicting a male or female perpetrator (heterosexual or same-sex) in a clear domestic assault situation. Officer age, experience with IPV and IPV training were examined as possible covariates. Training in IPV was not correlated with any dependent variable of interest. Age was correlated with perpetrator arrest and blame (.14 and .16, respectively), and years of experience was correlated with arrest, offering informal advice, and mediating the incident (.14 to -.17). A 2 (perpetrator gender) × 2 (victim gender) factorial analysis was conducted. Results revealed that officers were more likely to provide informal advice and mediate in gay male relationships, and were less likely to arrest perpetrators in same-sex relationships. When officer age and years of experience with domestic violence were statistically controlled, the effects for perpetrator arrest and providing informal advice were no longer significant.
Officers perceived heterosexual male perpetrators as more dangerous, blameworthy, and disrespectful, and believed they would receive significantly longer sentences than perpetrators in all other conditions. When officer age and experience were included as covariates in the analyses, perpetrator blame was no longer statistically significant. Age, experience and training in IPV were not related to perceptions of victims. Police perceived victims as more truthful and believable when the perpetrator was male. Police also believed victims of female perpetrators were more responsible for their own victimization, and such victims were more likely to be perceived as a danger to their family. Female perpetrators in same-sex relationships and heterosexual male perpetrators were considered to experience more mental illness than heterosexual female or gay male perpetrators. These results replicate previous research suggesting that male perpetrators are perceived as more blameworthy, and expand upon previous research by identifying potential biases in police responses to IPV in same-sex relationships. This study brings to the forefront the importance of evidence-based officer training in IPV, provides insight into the need for a gender-inclusive approach, and addresses the practical applications for police.

Keywords: domestic violence, heterosexual, intimate partner violence, officer response, police officer, same-sex

Procedia PDF Downloads 347
555 Construction of a Dynamic Migration Model of Extracellular Fluid in Brain for Future Integrated Control of Brain State

Authors: Tomohiko Utsuki, Kyoka Sato

Abstract:

In emergency medicine, it is recognized that brain resuscitation is very important for reducing the mortality rate and neurological sequelae. In particular, control of brain temperature (BT), intracranial pressure (ICP), and cerebral blood flow (CBF) is most required for stabilizing the brain's physiological state in the treatment of conditions such as brain injury, stroke, and encephalopathy. However, the manual control of BT, ICP, and CBF frequently requires decisions and operations by medical staff, relating to medication and the settings of therapeutic apparatus. Thus, integrating and automating this control is very effective not only for improving the therapeutic effect but also for reducing staff burden and medical cost. To realize such integration and automation, a mathematical model of the brain's physiological state is necessary as the controlled object in simulations, because performance testing of a prototype control system on patients is not ethically permissible. A model of cerebral blood circulation, the most basic part of the brain's physiological state, has already been constructed. A migration model of extracellular fluid in the brain has also been constructed; however, that model did not consider the condition that the total volume of the intracranial cavity is almost constant due to the rigidity of the cranial bone. Therefore, in this research, a dynamic migration model of extracellular fluid in the brain was constructed that takes the constancy of the intracranial cavity's total volume into account. This model can be connected to the cerebral blood circulation model. The constructed model consists of fourteen compartments, twelve of which correspond to the perfused areas of the bilateral anterior, middle and posterior cerebral arteries, while the others correspond to the cerebral ventricles and the subarachnoid space.
This model enables calculation of the migration of tissue fluid from capillaries to gray matter and white matter, the flow of tissue fluid between compartments, the production and absorption of cerebrospinal fluid at the choroid plexus and arachnoid granulations, and the production of metabolic water. Further, the volume, the colloid concentration, and the tissue pressure of/in each compartment can also be calculated by solving 40-dimensional non-linear simultaneous differential equations. In this research, the obtained model was analyzed for validation under four conditions: a normal adult, an adult with higher cerebral capillary pressure, an adult with lower cerebral capillary pressure, and an adult with lower colloid concentration in the cerebral capillaries. In the results, the calculated fluid flow, tissue volume, colloid concentration, and tissue pressure all converged to values suitable for the set conditions within 60 minutes at most. Because these results did not conflict with prior knowledge, the model can adequately represent the physiological state of the brain, at least under such limited conditions. One of the next challenges is to integrate this model with the already constructed cerebral blood circulation model. This modification will enable CBF and ICP to be simulated more precisely, by calculating the effect of blood pressure changes on extracellular fluid migration and the effect of ICP changes on CBF.
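The rigid-cranium constraint (a nearly constant total intracranial volume) can be illustrated with a toy compartment model. The sketch below is a hypothetical three-compartment reduction, not the authors' fourteen-compartment model: the elastances and inter-compartment conductances are invented, and pressure is modeled linearly as P_i = E_i V_i. Because each pairwise flow is antisymmetric, the total volume is conserved automatically, which is the structural property the abstract emphasizes.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Hypothetical symmetric conductances between compartments (illustrative values)
k = np.array([[0.0, 1.0, 0.5],
              [1.0, 0.0, 0.8],
              [0.5, 0.8, 0.0]])
# Hypothetical elastances: pressure in compartment i is P_i = E_i * V_i
E = np.array([2.0, 1.0, 1.5])

def rhs(t, V):
    P = E * V
    # Flow into compartment i from j is k[i, j] * (P_j - P_i).
    # These pairwise flows are antisymmetric, so the total volume
    # is conserved, mimicking the rigid-cranium constraint.
    return (k * (P[None, :] - P[:, None])).sum(axis=1)

V0 = np.array([1.0, 2.0, 0.5])  # initial compartment volumes (arbitrary units)
sol = solve_ivp(rhs, (0.0, 60.0), V0, rtol=1e-9, atol=1e-12)
V_end = sol.y[:, -1]
print(V_end, V_end.sum())  # total volume stays at 3.5 while pressures equalize
```

Integrating to t = 60 drives the compartment pressures to a common equilibrium value while the summed volume stays fixed at its initial total, the same qualitative convergence behavior the abstract reports for its 40-dimensional system.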

Keywords: dynamic model, cerebral extracellular migration, brain resuscitation, automatic control

Procedia PDF Downloads 156
554 Ground Motion Modeling Using the Least Absolute Shrinkage and Selection Operator

Authors: Yildiz Stella Dak, Jale Tezcan

Abstract:

Ground motion models that relate a strong motion parameter of interest to a set of predictive seismological variables describing the earthquake source, the propagation path of the seismic wave, and the local site conditions constitute a critical component of seismic hazard analyses. When a sufficient number of strong motion records are available, ground motion relations are developed using statistical analysis of the recorded ground motion data. In regions lacking a sufficient number of recordings, a synthetic database is developed using stochastic, theoretical or hybrid approaches. Regardless of the manner in which the database was developed, ground motion relations are developed using regression analysis. Development of a ground motion relation is a challenging process which inevitably requires the modeler to make subjective decisions regarding the inclusion criteria for the recordings, the functional form of the model, and the set of seismological variables to be included in the model. Because these decisions are critically important to the validity and the applicability of the model, there is continuous interest in procedures that will facilitate the development of ground motion models. This paper proposes the use of the Least Absolute Shrinkage and Selection Operator (LASSO) in selecting the set of predictive seismological variables to be used in developing a ground motion relation. The LASSO can be described as a penalized regression technique with a built-in capability for variable selection. Similar to ridge regression, the LASSO is based on the idea of shrinking the regression coefficients to reduce the variance of the model. Unlike ridge regression, where the coefficients are shrunk but never set equal to zero, the LASSO sets some of the coefficients exactly to zero, effectively performing variable selection.
Given a set of candidate input variables and the output variable of interest, the LASSO allows ranking the input variables by their relative importance, thereby facilitating the selection of the variables to be included in the model. Because the risk of overfitting increases with the ratio of the number of predictors to the number of recordings, selecting a compact set of variables is important when only a small number of recordings are available. In addition, identifying a small set of variables can improve the interpretability of the resulting model, especially when there is a large number of candidate predictors. A practical application of the proposed approach is presented, using more than 600 recordings from the Next Generation Attenuation (NGA) database, in which the effect of a set of seismological predictors on the 5% damped maximum direction spectral acceleration is investigated. The candidate predictors considered include magnitude, the closest distance to the rupture plane (Rrup), and the average shear-wave velocity in the top 30 m of the site (Vs30). Using the LASSO, the candidate predictors have been ranked by relative importance. Regression models of increasing complexity were constructed using the one, two, three, and four best predictors, and the models’ ability to explain the observed variance in the target variable has been compared. The bias-variance trade-off in the context of model selection is discussed.
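The ranking procedure described in the abstract can be sketched with a minimal coordinate-descent LASSO: sweep the penalty from strong to weak and record the penalty level at which each predictor's coefficient first becomes nonzero. The predictor names and data below are hypothetical stand-ins, not the NGA recordings or the paper's fitted model.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for ~600 strong-motion recordings with three candidate
# predictors (hypothetical names and coefficients; not the paper's data).
n = 600
names = ["magnitude", "log_Rrup", "log_Vs30"]
X = rng.normal(size=(n, 3))
y = 1.5 * X[:, 0] - 0.8 * X[:, 1] + 0.1 * X[:, 2] + rng.normal(scale=0.3, size=n)

# Standardize so the L1 penalty treats all predictors equally.
X = (X - X.mean(0)) / X.std(0)
y = y - y.mean()

def lasso(X, y, alpha, n_iter=200):
    """Minimal coordinate-descent LASSO: soft-threshold each coefficient."""
    n, p = X.shape
    beta = np.zeros(p)
    for _ in range(n_iter):
        for j in range(p):
            r = y - X @ beta + X[:, j] * beta[j]        # partial residual
            rho = X[:, j] @ r / n
            beta[j] = np.sign(rho) * max(abs(rho) - alpha, 0.0)
    return beta

# Sweep the penalty from strong to weak; a predictor's importance is the
# penalty level at which its coefficient first enters the model.
entry_alpha = {}
for alpha in np.logspace(0.5, -3, 60):
    coefs = lasso(X, y, alpha)
    for name, c in zip(names, coefs):
        if c != 0.0 and name not in entry_alpha:
            entry_alpha[name] = alpha

ranking = sorted(entry_alpha, key=entry_alpha.get, reverse=True)
print(ranking)  # predictors listed from most to least important
```

Stronger predictors survive a larger penalty, so they enter the model first; this is the ordering the abstract uses to select a compact predictor set.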

Keywords: ground motion modeling, least absolute shrinkage and selection operator, penalized regression, variable selection

Procedia PDF Downloads 330
553 Evaluation of Anti-inflammatory Activities of Extracts Obtained from Capparis Erythrocarpos In-Vivo

Authors: Benedict Ofori, Kwabena Sarpong, Stephen Antwi

Abstract:

Background: Medicinal plants are utilized all around the world and are becoming increasingly important economically. The WHO notes that ‘inappropriate use of traditional medicines or practices can have negative or dangerous effects’ and that further research is needed to ascertain the efficacy and safety of such practices and of the medicinal plants used by traditional medicine systems. The poor around the world have limited access to palliative care or pain relief. Pharmacologists have therefore focused on developing safe and effective anti-inflammatory drugs. Many of the issues related to the use of traditional and herbal treatments stem from the fact that they are classified in different nations as foods or dietary supplements; as a result, no evidence of quality, efficacy, or safety is required before these herbal formulations are marketed. Because access to drugs meant for pain relief is limited in low-income countries, rigorous studies of herbal remedies used for inflammation are needed to close this gap. Methods: The ethanolic extracts of the plant were screened for the presence of 10 phytochemicals. The Pierce BCA Protein Assay Kit was used to determine the protein concentration of the egg white. The rats were randomly assigned to six groups. Egg white was injected sub-plantar into the right hind paws of the rats to induce inflammation. The animals were treated with the three plant extracts obtained from the root bark, stem, and leaves of the plant. The control group was treated with normal saline, while the standard groups were treated with the standard drugs indomethacin and celecoxib. A plethysmometer was used to measure the change in paw volume of the animals over the course of the experiment. Results: The phytochemical screening revealed the presence of reducing sugars and saponins. Alkaloids were present only in R.L.S (1:1:1), and phytosterols were found in R.L (1:1) and R.L.S (1:1:1). 
The estimated protein concentration was found to be 103.75 mg/ml. The control group showed an observable increase in paw volume, indicating that inflammation had been induced; the increase peaked at the 1st hour and then declined gradually over the 5-hour experiment. The 2nd and 3rd groups were treated with 20 mg/kg of indomethacin and celecoxib, respectively; their anti-inflammatory activities were calculated to be 21.4% and 4.28%. The remaining 3 groups were treated with the plant extracts at 200 mg/kg. R.L.S, R.L, and S.R.L showed anti-inflammatory activities of 22.3%, 8.2%, and 12.07%, respectively. Conclusions: The egg albumin-induced paw oedema model in rats can be used to evaluate herbs with potential anti-inflammatory activity. Herbal medications have potential anti-inflammatory activities and can be used to manage various inflammatory conditions if their efficacy and side effects are well studied. All three extracts possessed anti-inflammatory activity, with R.L.S showing the highest.
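Percent anti-inflammatory activity in paw-oedema models of this kind is commonly computed from the relative reduction in paw swelling versus the control group. A minimal sketch, using hypothetical paw-volume figures rather than the study's raw data:

```python
# Percent anti-inflammatory activity from paw-volume changes, using the
# common formula: %inhibition = (1 - dV_treated / dV_control) * 100.
# The paw-volume figures below are hypothetical, not the study's raw data.

def percent_inhibition(dv_treated: float, dv_control: float) -> float:
    """dv_* are increases in paw volume (mL) relative to baseline."""
    return (1.0 - dv_treated / dv_control) * 100.0

dv_control = 0.56    # mL, saline-treated control group (hypothetical)
dv_extract = 0.435   # mL, extract-treated group (hypothetical)

print(f"anti-inflammatory activity: {percent_inhibition(dv_extract, dv_control):.1f}%")
```

The percentage activities reported in the abstract would follow from whatever paw-volume changes were actually measured at each time point.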

Keywords: inflammation, capparis erythrocarpos, anti-inflammatory activity, herbal medicine, paw volume, egg albumin

Procedia PDF Downloads 89
552 Theorizing Optimal Use of Numbers and Anecdotes: The Science of Storytelling in Newsrooms

Authors: Hai L. Tran

Abstract:

When covering events and issues, the news media often employ personal accounts as well as facts and figures. However, the process of using numbers and narratives in the newsroom mostly operates through trial and error. There is a demonstrated need for the news industry to better understand the specific effects of storytelling and data-driven reporting on the audience, as well as the explanatory factors driving such effects. In the academic world, anecdotal evidence and statistical evidence have been studied in a mutually exclusive manner. Existing research tends to treat the pertinent effects as though the use of one form precludes the other, and as if a tradeoff is required. Meanwhile, narratives and statistical facts are often combined in various communication contexts, especially in news presentations. There is value in reconceptualizing and theorizing about both the relative and the collective impacts of numbers and narratives, as well as the mechanism underlying such effects. The current undertaking seeks to link theory to practice by providing a complete picture of how and why people are influenced by information conveyed through quantitative and qualitative accounts. Specifically, cognitive-experiential theory is invoked to argue that humans employ two distinct systems to process information. The rational system requires the processing of logical evidence through effortful analytical cognitions, which are affect-free. The experiential system, by contrast, is intuitive, rapid, automatic, and holistic, demanding minimal cognitive resources and relating to the experience of affect. In certain situations one system might dominate the other, but the rational and experiential modes of processing operate in parallel and at the same time. As such, anecdotes and quantified facts affect audience response differently, and a combination of data and narratives is more effective than either form of evidence alone. 
In addition, the present study identifies several media variables and human factors driving the effects of statistics and anecdotes. An integrative model is proposed to explain how message characteristics (modality, vividness, salience, congruency, position) and individual differences (involvement, numeracy skills, cognitive resources, cultural orientation) impact selective exposure, which in turn activates pertinent modes of processing, and thereby induces corresponding responses. The present study represents a step toward bridging theoretical frameworks from various disciplines to better understand the specific effects and the conditions under which the use of anecdotal evidence and/or statistical evidence enhances or undermines information processing. In addition to theoretical contributions, this research helps inform news professionals about the benefits and pitfalls of incorporating quantitative and qualitative accounts in reporting. It proposes a typology of possible scenarios and appropriate strategies for journalists to use when presenting news with anecdotes and numbers.

Keywords: data, narrative, number, anecdote, storytelling, news

Procedia PDF Downloads 79
551 Synthesis of Carbon Nanotubes from Coconut Oil and Fabrication of a Non Enzymatic Cholesterol Biosensor

Authors: Mitali Saha, Soma Das

Abstract:

The fabrication of nanoscale materials for use in chemical sensing, biosensing, and biological analyses has proven a promising avenue in the last few years. Cholesterol has aroused considerable interest on account of its being an important parameter in clinical diagnosis. There is a strong positive correlation between high serum cholesterol levels and arteriosclerosis, hypertension, and myocardial infarction. Enzyme-based electrochemical biosensors have shown high selectivity and excellent sensitivity, but the enzyme is easily denatured during immobilization, and its activity is affected by temperature, pH, and toxic chemicals. Besides, the reproducibility of enzyme-based sensors is not very good, which further restricts the application of cholesterol biosensors. It has been demonstrated that carbon nanotubes can promote electron transfer with various redox-active proteins, ranging from cytochrome c to glucose oxidase with its deeply embedded redox center. In continuation of our earlier work on the synthesis and applications of carbon- and metal-based nanoparticles, we report here the synthesis of carbon nanotubes (CCNT) by burning coconut oil under an insufficient flow of air using an oil lamp. The soot was collected from the top portion of the flame, where the temperature was around 650 °C, and was then purified, functionalized, and characterized by SEM, p-XRD, and Raman spectroscopy. The SEM micrographs showed the formation of tubular CCNT structures with diameters below 100 nm. The XRD pattern showed two predominant peaks at 25.2° and 43.8°, corresponding to the (002) and (100) planes of CCNT, respectively. The Raman spectrum (514 nm excitation) showed a G-band at 1600 cm⁻¹, related to the vibration of sp²-bonded carbon, and a D-band at 1350 cm⁻¹, arising from the vibrations of sp³-bonded carbon. 
A nonenzymatic cholesterol biosensor was then fabricated on an insulating Teflon substrate carrying three silver wires at the surface, covered by the CCNT obtained from coconut oil. Here, the CCNT served as both the working and counter electrodes, whereas the reference electrode and electric contacts were made of silver. The dimensions of the electrode were 3.5 cm × 1.0 cm × 0.5 cm (length × width × height), making it suitable for working with 50 µL volumes, like standard screen-printed electrodes. The voltammetric behavior of cholesterol at the CCNT electrode was investigated by cyclic voltammetry and differential pulse voltammetry using 0.001 M H2SO4 as the electrolyte. The influence of experimental parameters on the peak currents of cholesterol, such as pH, accumulation time, and scan rate, was optimized. Under optimum conditions, the peak current was linear in the cholesterol concentration range from 1 µM to 50 µM, with a sensitivity of ~15.31 μA μM⁻¹ cm⁻², a lower detection limit of 0.017 µM, and a response time of about 6 s. The long-term storage stability of the sensor was tested for 30 days, and the current response was found to be ~85% of its initial value after 30 days.
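Sensitivity and detection limit figures like those above are typically derived from a linear calibration: the slope of the current-concentration line is normalized by electrode area, and the detection limit follows the 3σ criterion. The calibration data, electrode area, and blank noise below are hypothetical, not the reported measurements.

```python
import numpy as np

# Hypothetical calibration over the reported linear range (1-50 µM).
conc = np.array([1, 5, 10, 20, 30, 40, 50], dtype=float)     # µM
current = np.array([0.9, 4.1, 8.2, 15.9, 24.1, 31.8, 40.2])  # µA

# Least-squares line: current = slope * conc + intercept
slope, intercept = np.polyfit(conc, current, 1)

area_cm2 = 0.05                  # assumed electrode area, cm²
sensitivity = slope / area_cm2   # µA µM⁻¹ cm⁻²

sd_blank = 0.004                 # assumed standard deviation of the blank, µA
lod = 3 * sd_blank / slope       # detection limit (3σ criterion), µM

print(f"slope = {slope:.2f} µA/µM, sensitivity = {sensitivity:.1f} µA/µM/cm², "
      f"LOD = {lod:.3f} µM")
```

With real calibration data the same two lines of arithmetic would yield the reported ~15.31 μA μM⁻¹ cm⁻² sensitivity and 0.017 µM detection limit.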

Keywords: coconut oil, CCNT, cholesterol, biosensor

Procedia PDF Downloads 282
550 Rheological Characterization of Polysaccharide Extracted from Camelina Meal as a New Source of Thickening Agent

Authors: Mohammad Anvari, Helen S. Joyner (Melito)

Abstract:

Camelina sativa (L.) Crantz is an oilseed crop currently used for the production of biofuels. However, the low price of diesel and gasoline has made camelina an unprofitable crop for farmers, leading to declining camelina production in the US. Hence, the ability to utilize camelina byproduct (defatted meal) after oil extraction would be a pivotal factor for promoting the economic value of the plant. Camelina defatted meal is rich in proteins and polysaccharides. The great diversity in the polysaccharide structural features provides a unique opportunity for use in food formulations as thickeners, gelling agents, emulsifiers, and stabilizers. There is currently a great degree of interest in the study of novel plant polysaccharides, as they can be derived from readily accessible sources and have potential application in a wide range of food formulations. However, there are no published studies on the polysaccharide extracted from camelina meal, and its potential industrial applications remain largely underexploited. Rheological properties are a key functional feature of polysaccharides and are highly dependent on the material composition and molecular structure. Therefore, the objective of this study was to evaluate the rheological properties of the polysaccharide extracted from camelina meal at different conditions to obtain insight on the molecular characteristics of the polysaccharide. Flow and dynamic mechanical behaviors were determined under different temperatures (5-50°C) and concentrations (1-6% w/v). Additionally, the zeta potential of the polysaccharide dispersion was measured at different pHs (2-11) and a biopolymer concentration of 0.05% (w/v). Shear rate sweep data revealed that the camelina polysaccharide displayed shear thinning (pseudoplastic) behavior, which is typical of polymer systems. 
The polysaccharide dispersion (1% w/v) showed no significant changes in viscosity with temperature, which makes it a promising ingredient in products requiring texture stability over a range of temperatures. However, the viscosity increased significantly with concentration, indicating that camelina polysaccharide can be used in food products at different concentrations to produce a range of textures. Dynamic mechanical spectra showed similar trends: temperature had little effect on the viscoelastic moduli, but the moduli were strongly affected by concentration, with samples exhibiting concentrated-solution behavior at low concentrations (1-2% w/v) and weak gel behavior at higher concentrations (4-6% w/v). These rheological properties can be used for the design and modeling of liquid and semisolid products. Zeta potential affects the intensity of molecular interactions and molecular conformation, and can alter the solubility, stability, and, eventually, the functionality of materials as their environment changes. In this study, the zeta potential decreased significantly from 0.0 to -62.5 mV as pH increased from 2 to 11, indicating that pH may affect the functional properties of the polysaccharide. The results of the current study show that camelina polysaccharide has significant potential for application in various food systems and can be introduced as a novel anionic thickening agent with unique properties.
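Shear-thinning flow curves of the kind described above are commonly summarized with the power-law (Ostwald-de Waele) model, η = K·γ̇^(n−1), where a flow behavior index n < 1 indicates pseudoplasticity. A minimal fitting sketch with hypothetical parameter values, not the study's measured flow curves:

```python
import numpy as np

# Power-law (Ostwald-de Waele) model: eta = K * gamma_dot**(n - 1), n < 1
# for shear thinning. K_true and n_true are hypothetical illustration values.
K_true, n_true = 2.0, 0.4

shear_rate = np.logspace(-1, 3, 20)                  # 1/s
viscosity = K_true * shear_rate ** (n_true - 1.0)    # Pa·s

# Fit by linear regression in log-log space:
# log(eta) = log(K) + (n - 1) * log(gamma_dot)
slope, log_K = np.polyfit(np.log(shear_rate), np.log(viscosity), 1)
n_fit, K_fit = slope + 1.0, np.exp(log_K)

print(f"n = {n_fit:.2f} (n < 1 confirms shear thinning), K = {K_fit:.2f}")
```

Fitting n and K at each concentration is one standard way to quantify how strongly a thickener's pseudoplasticity changes with concentration.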

Keywords: Camelina meal, polysaccharide, rheology, zeta potential

Procedia PDF Downloads 245
549 Simulation of Hydraulic Fracturing Fluid Cleanup for Partially Degraded Fracturing Fluids in Unconventional Gas Reservoirs

Authors: Regina A. Tayong, Reza Barati

Abstract:

A stable, fast, and robust three-phase, 2D IMPES simulator has been developed for assessing the influence of breaker concentration on the yield stress of the filter cake and the broken gel viscosity, of varying polymer concentration/yield stress along the fracture face, and of fracture conductivity, fracture length, capillary pressure changes, and formation damage on fracturing fluid cleanup in tight gas reservoirs. The model has been validated against field data reported in the literature for the same reservoir. A 2D, two-phase (gas/water) fracture propagation model is used to model the invasion zone and create the initial conditions for the clean-up model by distributing 200 bbl of water around the fracture. A 2D, three-phase IMPES simulator incorporating a yield-power-law rheology has been developed in MATLAB to characterize fluid flow through a hydraulically fractured grid. The variation in polymer concentration along the fracture is computed from a material balance equation relating the initial polymer concentration to the total volume of injected fluid and the fracture volume. All governing equations and the methods employed have been reported in sufficient detail to permit easy replication of the results. Increasing capillary pressure in the formation simulated in this study resulted in a 10.4% decrease in cumulative production after 100 days of fluid recovery. Increasing the breaker concentration from 5 to 15 gal/Mgal, which lowers the yield stress and viscosity of a 200 lb/Mgal guar fluid, resulted in a 10.83% increase in cumulative gas production. For tight gas formations (k = 0.05 md), fluid recovery increases with increasing shut-in time, fracture conductivity, and fracture length, irrespective of the yield stress of the fracturing fluid. Mechanically induced formation damage combined with hydraulic damage tends to be the most significant. 
Several correlations have been developed relating pressure distribution and polymer concentration to distance along the fracture face, and average polymer concentration to injection time. The gradient in the yield stress distribution along the fracture face becomes steeper with increasing polymer concentration. The rate at which the yield stress (τ_o) increases is found to be proportional to the square of the volume of fluid lost to the formation. Finally, an improvement on previous results was achieved by simulating yield stress variation along the fracture face rather than assuming constant values, because fluid loss to the formation and the polymer concentration along the fracture face decrease with distance from the injection well. The novelty of this three-phase flow model lies in its ability to (i) simulate yield stress variation with fluid-loss volume along the fracture face for different initial guar concentrations, (ii) simulate the effect of increasing breaker activity on yield stress and broken gel viscosity, and (iii) capture the effect of (i) and (ii) on cumulative gas production within reasonable computational time.
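Two of the relations described above, the polymer material balance along the fracture and the quadratic growth of yield stress with fluid-loss volume, can be sketched as follows. The proportionality constant and volumes are placeholders, not the paper's calibrated values.

```python
def polymer_concentration(c0, v_injected, v_fracture):
    """Material balance: the polymer stays in the fracture while carrier fluid
    leaks off, so concentration scales as injected volume / remaining fracture
    volume. c0 in lb/Mgal; volumes in consistent units (e.g. bbl)."""
    return c0 * v_injected / v_fracture

def yield_stress(v_loss, k=1e-4):
    """Yield stress grows with the square of the fluid-loss volume;
    k is an assumed placeholder proportionality constant."""
    return k * v_loss ** 2

# Example: 200 lb/Mgal guar, 200 bbl injected, fracture volume reduced by leakoff
c = polymer_concentration(200.0, v_injected=200.0, v_fracture=80.0)
print(f"concentrated polymer: {c:.0f} lb/Mgal")

# Doubling the fluid lost to the formation quadruples the yield stress
print(yield_stress(2.0) / yield_stress(1.0))
```

This illustrates why the filter cake becomes harder to clean up near the wellbore: leakoff concentrates the remaining gel, and the yield stress rises faster than the loss volume itself.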

Keywords: formation damage, hydraulic fracturing, polymer cleanup, multiphase flow numerical simulation

Procedia PDF Downloads 130
548 The Effect of Soil-Structure Interaction on the Post-Earthquake Fire Performance of Structures

Authors: A. T. Al-Isawi, P. E. F. Collins

Abstract:

The behaviour of structures exposed to fire after an earthquake is not a new area of engineering research, but a number of areas remain where further work is required. These relate to the way in which seismic excitation is applied to a structure, taking into account the effect of soil-structure interaction (SSI), the method of analysis, and the identification of the excitation load properties. The selection of earthquake input data for use in nonlinear analysis and the method of analysis are still challenging issues. Realistic artificial ground motion input data must therefore be developed to ensure that the site property parameters adequately describe the effects of the nonlinear inelastic behaviour of the system and that the characteristics of these parameters are coherent with those of the target parameters. Conversely, ignoring the significance of attributes such as frequency content, soil site properties, and earthquake parameters may lead to misleading results, owing to the misinterpretation of the required input data and an incorrect synthesis of the analysis hypotheses. This paper presents a study of the post-earthquake fire (PEF) performance of a multi-storey steel-framed building resting on soft clay, taking into account the nonlinear inelastic behaviour of the structure and soil, and the soil-structure interaction (SSI). Structures subjected to an earthquake may experience various levels of damage: geometrical damage, which denotes the change in the structure’s initial geometry due to the residual deformation resulting from plastic behaviour, and mechanical damage, which denotes the degradation of the mechanical properties of the structural elements that entered the plastic range of deformation. 
Consequently, the structure presumably experiences partial structural damage and is then exposed to fire with its new residual material properties, which may result in building failure caused by a decrease in fire resistance. This scenario is more complicated still if SSI is also considered. Indeed, most earthquake design codes ignore the possibility of PEF as well as the effect of SSI on the behaviour of structures, in order to simplify the analysis procedure. Therefore, designing structures to existing codes that neglect PEF and SSI can create a significant risk of structural failure. In order to examine the behaviour of a structure under PEF conditions, a two-dimensional nonlinear elasto-plastic model is developed using ABAQUS software, with the effects of SSI included. Both geometrical and mechanical damage are taken into account after the earthquake analysis step. For comparison, an identical model that excludes soil-structure interaction is also created. It is shown that damage to structural elements is underestimated if SSI is not included in the analysis, and that the maximum percentage reduction in fire resistance occurs when SSI is included. The results are validated against findings in the literature.

Keywords: Abaqus Software, Finite Element Analysis, post-earthquake fire, seismic analysis, soil-structure interaction

Procedia PDF Downloads 121
547 Methodology for Risk Assessment of Nitrosamine Drug Substance Related Impurities in Glipizide Antidiabetic Formulations

Authors: Ravisinh Solanki, Ravi Patel, Chhaganbhai Patel

Abstract:

Purpose: The purpose of this study is to develop a methodology for the risk assessment and evaluation of nitrosamine impurities in glipizide antidiabetic formulations. Nitroso compounds, including nitrosamines, have emerged as significant concerns in drug products, as highlighted by the ICH M7 guidelines. This study aims to identify known and potential sources of nitrosamine impurities that may contaminate glipizide formulations and to assess their presence. By determining observed or predicted levels of these impurities and comparing them with regulatory guidance, this research will contribute to ensuring the safety and quality of combination antidiabetic drug products on the market. Factors contributing to the presence of genotoxic nitrosamine contaminants in glipizide medications, such as secondary and tertiary amines and nitroso-group complex-forming molecules, will be investigated. Additionally, the conditions necessary for nitrosamine formation, including the presence of nitrosating agents and acidic environments, will be examined to improve understanding and mitigation strategies. Method: The methodology involves the N-Nitroso Acid Precursor (NAP) test, as recommended by the WHO in 1978 and detailed in the 1980 International Agency for Research on Cancer monograph. Individual glass vials containing quantities of glipizide equivalent to 10 mM are prepared. The compound is dissolved in an acidic environment and supplemented with 40 mM NaNO2, and the resulting solutions are maintained at 37 °C for 4 hours. For the analysis of the samples, a fit-for-purpose HPLC separation is employed. LC resolution is achieved using a step gradient on an Agilent Eclipse Plus C18 column (4.6 × 100 mm, 3.5 µm). Mobile phases A and B consist of 0.1% v/v formic acid in water and acetonitrile, respectively, run in a gradient mode program. 
The flow rate is set at 0.6 mL/min, and the column compartment temperature is maintained at 35 °C. Detection is performed using a PDA detector within the wavelength range of 190-400 nm. To determine the exact mass of the formed nitrosamine drug substance related impurities (NDSRIs), the HPLC method is transferred to LC-TQ-MS/MS with the same mobile phase composition and gradient program. The injection volume is set at 5 µL, and MS analysis is conducted in electrospray ionization (ESI) mode within the mass range of 100-1000 Da. Results: The NAP test samples were prepared according to the protocol and analyzed using HPLC and LC-TQ-MS/MS to identify possible NDSRIs generated in different formulations of glipizide. The NAP test was found to generate various NDSRIs, including previously unreported contamination of glipizide. These NDSRIs are categorised based on their predicted carcinogenic potency, and acceptable intakes in medicines are recommended. The analytical method was found to be specific and reproducible.

Keywords: NDSRI, nitrosamine impurities, antidiabetic, glipizide, LC-MS/MS

Procedia PDF Downloads 32
546 Need for Elucidation of Palaeoclimatic Variability in the High Himalayan Mountains: A Multiproxy Approach

Authors: Sheikh Nawaz Ali, Pratima Pandey, P. Morthekai, Jyotsna Dubey, Md. Firoze Quamar

Abstract:

High mountain glaciers are among the most sensitive recorders of climate change, because they respond to the combined effects of snowfall and temperature. The Himalayan glaciers have been studied at a good pace during the last decade. However, owing to its large ecological diversity and geographical vastness, a major part of the Indian Himalaya remains uninvestigated, and hence the palaeoclimatic patterns, and the chronology of past glaciations in particular, remain controversial for the entire Indian Himalayan transect. Although the Himalayan glaciers are nourished by two important climatic systems, viz. the southwest summer monsoon and the mid-latitude westerlies, the relative influence of these systems is yet to be understood. Nevertheless, the existing chronology (mostly exposure ages) indicates that, irrespective of geographical position, glaciers seem to have grown during periods of enhanced Indian summer monsoon (ISM). The Himalayan mountain glaciers are referred to as the Third Pole or the water tower of Asia, as they form a huge reservoir of fresh water for the Asian countries. Mountain glaciers are sensitive probes of the local climate, and thus they present both an opportunity and a challenge: to interpret climates of the past and to predict future changes. The principal objective of all palaeoclimatic studies is to develop predictive models and scenarios. However, it has been found that the glacial chronologies bracket only the major phases of climatic events, and other climatic proxies are sparse in the Himalaya. This is why compilations of data on rapid climatic change during the Holocene show major gaps in this region. Sedimentation in proglacial lakes, conversely, is more continuous and hence can be used to reconstruct a more complete record of past climatic variability, modulated by the changing ice volume of the valley glacier. 
The Himalayan region has numerous proglacial lacustrine deposits formed during the late Quaternary period. However, only a few of these deposits have been studied so far. Therefore, it is high time that efforts were made to systematically map the moraines located in different climatic zones, reconstruct the local and regional moraine stratigraphy, and use multiple dating techniques to bracket the events of glaciation. Besides this, emphasis must be placed on carrying out multiproxy studies of the lacustrine sediments, which will provide high-resolution palaeoclimatic data from the alpine region of the Himalaya. Although the Himalayan glaciers fluctuated in accordance with changing climatic conditions (natural forcing), it is too early to arrive at any firm conclusion. It is crucial to generate multiproxy data sets covering wider geographical and ecological domains, taking into consideration the multiple parameters that directly or indirectly influence glacier mass balance as well as the local climate of a region.

Keywords: glacial chronology, palaeoclimate, multiproxy, Himalaya

Procedia PDF Downloads 263
545 Impact of Ecosystem Engineers on Soil Structuration in a Restored Floodplain in Switzerland

Authors: Andreas Schomburg, Claire Le Bayon, Claire Guenat, Philip Brunner

Abstract:

Numerous river restoration projects have been established in Switzerland in recent years, after decades of human activity in floodplains. The success of restoration projects in terms of biodiversity and ecosystem functions largely depends on the development of the floodplain soil system. Plants and earthworms, as ecosystem engineers, are known to be able to build up a stable soil structure by incorporating soil organic matter into the soil matrix, creating water-stable soil aggregates. Their engineering efficiency, however, largely depends on changing soil properties and frequent floods along an evolving floodplain transect. This study therefore aims to quantify the effect of flood frequency and duration, as well as of physico-chemical soil parameters, on the engineering efficiency of plants and earthworms. It is furthermore predicted that these influences may affect the two engineers differently, leading to varying contributions to aggregate formation along the floodplain transect. Ecosystem engineers were sampled and described in three different floodplain habitats, differentiated according to the evolutionary stage of the vegetation, ranging from pioneer to forest vegetation, in a floodplain restored 15 years ago. In addition, the same analyses were performed in an embanked adjacent pasture as a reference for the pre-restoration state. Soil aggregates were collected and analyzed for organic matter quantity and quality using Rock-Eval pyrolysis. Water level and discharge measurements dating back to 2008 were used to quantify the return period of major floods. Our results show an increasing amount of water-stable aggregates in the soil with increasing distance from the river, with the largest values in the reference site. Decreasing flood frequency and the proportion of silt and clay in the soil texture explain these findings, according to F values from a one-way ANOVA of a fitted mixed-effects model. 
Significantly larger amounts of labile organic matter signatures were found in soil aggregates in the forest habitat and in the reference site, indicating a larger contribution of plants to soil aggregation in these habitats compared to the pioneer vegetation zone. The contribution of earthworms to soil aggregation does not differ significantly along the floodplain transect, but their effect could be identified even in the pioneer vegetation, with its large proportion of coarse sand in the soil texture and frequent inundations. These findings indicate that ecosystem engineers are able to create soil aggregates even under unfavorable soil conditions and frequent floods. Restoration success can therefore be expected even in ecosystems with harsh soil properties and frequent external disturbances.

Keywords: ecosystem engineers, flood frequency, floodplains, river restoration, rock eval pyrolysis, soil organic matter incorporation, soil structuration

Procedia PDF Downloads 269
544 The Link between Strategic Sense-Making and Performance in Dubai Public Sector

Authors: Mohammad Rahman, Guy Burton, Megan Mathias

Abstract:

Strategic management as an organizational practice was adopted by the public sector in the New Public Management (NPM) era that began in most parts of the world in the 1980s. Strategy as a new public management concept was subscribed to by governments in both the developed and developing worlds, as they were persuaded that clearly defined visions, missions, and goals, as well as programs and projects aligned with those goals, could help achieve the government's vision at the national level and organizational goals at the service-delivery level. Advocates for strategic management in the public sector saw an inherent link between strategy and performance, claiming that the implementation of organizational strategy affects the overall performance of an organization. Arguably, many government entities that have failed to enhance team and individual performance had poorly designed strategies or weak strategy implementation. Another key argument links low performance to a lack of strategic sense-making and orientation, by middle managers in particular. Scholars maintain that employees at all levels need to understand the strategic management plan in order to facilitate its implementation. Therefore, involving employees (particularly middle managers) from the beginning potentially helps an organization avoid a drop in performance and, on the contrary, increases their commitment. The United Arab Emirates (UAE) has been well known for adopting public sector reform strategies and tools since the 1990s. This observation is contextually pertinent in the case of the Government of Dubai, which has provided a Strategy Execution Guide to all of its entities to achieve strategic success in service delivery. The Dubai public sector also adopts road maps for e-Government, Smart Dubai, Expo 2020, investment, environment, education, health, and other sectors. Evidently, some of these strategies are bringing tangible (e.g. 
Smart Dubai transformation) results. However, the amount of academic research and literature on the strategy process vis-à-vis staff performance in the Government of Dubai is limited. Against this backdrop, this study examines how the individual performance of public sector employees in Dubai is linked with their sense-making of, engagement with, and orientation toward strategy development and implementation processes. Based on a theoretical framework, the study will undertake a sample-based questionnaire survey among middle managers in the Dubai public sector to (a) measure the level of engagement of middle managers in strategy development and implementation processes as perceived by them; (b) observe the organizational landscape in which role expectations are placed on middle managers; and (c) examine the impact of employee engagement in the strategy development process, and of the conditions for role expectations, on individual performance. The paper is expected to provide new insights into the interface between strategic sense-making and performance, contributing to a better understanding of the current culture and practices of staff engagement in strategic management in the public sector of Dubai.

Keywords: employee performance, government of Dubai, middle managers, strategic sense-making

Procedia PDF Downloads 197