Search results for: global market share
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 8940

1200 Using ICESat-2 Dynamic Ocean Topography to Estimate Western Arctic Freshwater Content

Authors: Joshua Adan Valdez, Shawn Gallaher

Abstract:

Global climate change has impacted atmospheric temperatures contributing to rising sea levels, decreasing sea ice, and increased freshening of high latitude oceans. This freshening has contributed to increased stratification inhibiting local mixing and nutrient transport, modifying regional circulations in polar oceans. In recent years, the Western Arctic has seen an increase in freshwater volume at an average rate of 397 ± 116 km³/year across the Beaufort Gyre. The majority of the freshwater volume resides in the Beaufort Gyre surface lens driven by anticyclonic wind forcing, sea ice melt, and Arctic river runoff, and is typically defined as water fresher than 34.8. The near-isothermal nature of Arctic seawater and non-linearities in the equation of state for near-freezing waters result in a salinity-driven pycnocline as opposed to the temperature-driven density structure seen in the lower latitudes. In this study, we investigate the relationship between freshwater content and dynamic ocean topography (DOT). In situ measurements of freshwater content are useful in providing information on the freshening rate of the Beaufort Gyre; however, their collection is costly and time-consuming. Utilizing NASA ICESat-2's DOT remote sensing capabilities and Air Expendable CTD (AXCTD) data from the Seasonal Ice Zone Reconnaissance Surveys (SIZRS), a linear regression model between DOT and freshwater content is determined along the 150° west meridian. Freshwater content is calculated by integrating the volume of water between the surface and the depth at which a reference salinity of ~34.8 is reached. Using this model, we compare interannual variability in freshwater content within the gyre, which could provide a future predictive capability of freshwater volume changes in the Beaufort-Chukchi Sea using non-in situ methods. Successful employment of ICESat-2's DOT approximation of freshwater content could potentially demonstrate the value of remote sensing tools to reduce reliance on field deployment platforms to characterize physical ocean properties.
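
As a rough illustration of the workflow this abstract describes (not the authors' code), the sketch below integrates the salinity anomaly relative to the 34.8 reference to obtain freshwater content and fits a linear regression between DOT and freshwater content. The synthetic profiles, the DOT scaling, and the use of numpy/scipy are assumptions for demonstration only.

```python
# Hedged sketch, not the authors' code: FWC from a salinity profile plus a DOT-FWC regression.
import numpy as np
from scipy import stats

S_REF = 34.8  # reference salinity bounding the freshwater lens

def freshwater_content(depth_m, salinity):
    """Integrate (S_REF - S(z)) / S_REF from the surface down to the S_REF isohaline."""
    mask = salinity <= S_REF                     # freshwater layer above the reference salinity
    anomaly = (S_REF - salinity[mask]) / S_REF   # fractional freshwater per unit depth
    z = depth_m[mask]
    return np.sum(0.5 * (anomaly[:-1] + anomaly[1:]) * np.diff(z))  # trapezoidal FWC in metres

# Synthetic AXCTD-style profiles along 150° W (stand-ins for SIZRS casts; values are invented)
rng = np.random.default_rng(0)
depth = np.linspace(0.0, 400.0, 200)                        # depth grid in metres
lens = rng.uniform(0.5, 1.5, 20)                            # varying strength of the surface lens
profiles = [(depth, S_REF - 4.0 * w * np.exp(-depth / 80.0)) for w in lens]
fwc = np.array([freshwater_content(z, s) for z, s in profiles])
dot = 0.02 * fwc + rng.normal(0.0, 0.05, fwc.size)          # synthetic co-located ICESat-2 DOT (m)

# Linear regression between DOT and freshwater content, as described in the abstract
result = stats.linregress(dot, fwc)
print(f"FWC ≈ {result.slope:.1f} * DOT + {result.intercept:.1f}  (r² = {result.rvalue**2:.2f})")
```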

Keywords: Cryosphere, remote sensing, Arctic oceanography, climate modeling, Ekman transport

Procedia PDF Downloads 64
1199 Translation of Scientific and Technological Terms into Hausa Language: A Guide to Hausa Language Translator in an Electronic Media (Radio)

Authors: Surajo Ladan

Abstract:

There is no doubt that nowadays the media play a crucial role in the development of languages. Media practitioners influence and set our linguistic norms to a great extent. Their strategic position makes them more influential than school teachers as linguistic pacesetters and models. This is so because of the direct access to the general public that the media enjoy: being public-oriented and at the same time patronized by the public, the media are regarded as an authority as far as language use is concerned. In the modern world, listening to the news has become part and parcel of our daily lives. Easy communication has made the world a global village. Contact between countries and people is increasing daily. In Nigeria, and indeed the whole of West Africa, radio is the most widespread of the three types of media (radio, television, and print). This is because of radio's cheapness, simplicity, and flexibility. Therefore, the positive or negative effect of radio on the lives of a typical Nigerian or African cannot be overemphasized. The Hausa language, on the other hand, is one of the most widely spoken languages in West Africa and, of course, the lingua franca in the northern part of Nigeria and southern Niger. The language is used to a large extent by almost all the popular foreign media houses: the BBC, VOA, Deutsche Welle Radio, Radio France International, Radio China, etc. Many people in Nigeria and West Africa depend on the news in this language. In fact, even government programmes, mobilization, education, and sensitization of the populace are conducted in this language through the broadcast media. Against this background, effective and efficient work of this nature requires the services of a trained translator for the purpose of translating scientific and technological terms. The main thrust of this paper is the fact that no nation develops using a foreign or borrowed language. This is in line with the 1953 UNESCO declaration that 'the best Language of Instruction (LOI) is the vernacular or the Mother Tongue (MT) of the learner'. This idea is in the right direction, especially nowadays, when developing nations have come to terms with the reality that their destiny is really in their own hands, not in the hands of the so-called developed nations.

Keywords: translation, scientific, technological, language, radio, media

Procedia PDF Downloads 358
1198 Global Capitalism and Commodification of Breastfeeding: An Investigation of Its Impact on the “Traditional” African Conception of Family Life and Motherhood

Authors: Mosito Jonas Seabela

Abstract:

Breastfeeding in public has become a contentious issue in contemporary society. Mothers are often subjected to unfair discrimination and harassment for simply responding to their maternal instinct to breastfeed their infants. The unwillingness of society to accept public breastfeeding as a natural, non-sexual act is partly influenced by the imposition of a pornified and hypersexualised Western culture, which was imported to Africa through colonisation, enforced by the apartheid regime, and is now perpetuated by Western media. The imposition of the modern nuclear family on Africans, and the coerced aspiration to subscribe to bourgeois values, has eroded the moral standing of the traditional African family and its cultural values. Western-centric perceptions of African women have altered the experience of motherhood for many, commodifying the practice of breastfeeding. As a result, the use of bottles and infant formulas is often perceived as the preferred method, while breastfeeding in public is viewed as primitive, immoral, and unacceptable. This normative study seeks to answer the question of what ought to be done to preserve the dignity of African motherhood and protect their right to breastfeed in public. The African philosophy of Ubuntu is employed to advocate for the right to breastfeed in public. This moral philosophy posits that the western perception of a person seeks to isolate people from their environment and culture, thereby undermining the process of acquiring humanity, which fosters social cohesion. The Ubuntu philosophy embodies the aphorism, “umuntu ngumuntu nga bantu”, meaning “a person is a person through other persons”, signifying people’s interconnectedness and interdependence. The application of the key principles of Ubuntu, such as “survival, the spirit of solidarity, compassion, respect, and dignity” can improve human interaction and unite the public to support the government’s efforts to increase exclusive breastfeeding rates and reduce infant mortality rates. A doctrine called “Ubuntu Lactivism” is what the author proposes as a means to advocate for breastfeeding rights in fulfilment of African traditional values.

Keywords: ubuntu, breastfeeding, Afrocentric, colonization, culture, motherhood, imperialism, objectification

Procedia PDF Downloads 61
1197 A Methodology Based on Image Processing and Deep Learning for Automatic Characterization of Graphene Oxide

Authors: Rafael do Amaral Teodoro, Leandro Augusto da Silva

Abstract:

Originated from graphite, graphene is a two-dimensional (2D) material that promises to revolutionize technology in many different areas, such as energy, telecommunications, civil construction, aviation, textile, and medicine. This is possible because its structure, formed by carbon bonds, provides desirable optical, thermal, and mechanical characteristics that are interesting to multiple areas of the market. Thus, several research and development centers are studying different manufacturing methods and material applications of graphene, which are often compromised by the scarcity of more agile and accurate methodologies to characterize the material – that is to determine its composition, shape, size, and the number of layers and crystals. To engage in this search, this study proposes a computational methodology that applies deep learning to identify graphene oxide crystals in order to characterize samples by crystal sizes. To achieve this, a fully convolutional neural network called U-net has been trained to segment SEM graphene oxide images. The segmentation generated by the U-net is fine-tuned with a standard deviation technique by classes, which allows crystals to be distinguished with different labels through an object delimitation algorithm. As a next step, the characteristics of the position, area, perimeter, and lateral measures of each detected crystal are extracted from the images. This information generates a database with the dimensions of the crystals that compose the samples. Finally, graphs are automatically created showing the frequency distributions by area size and perimeter of the crystals. This methodological process resulted in a high capacity of segmentation of graphene oxide crystals, presenting accuracy and F-score equal to 95% and 94%, respectively, over the test set. Such performance demonstrates a high generalization capacity of the method in crystal segmentation, since its performance considers significant changes in image extraction quality. The measurement of non-overlapping crystals presented an average error of 6% for the different measurement metrics, thus suggesting that the model provides a high-performance measurement for non-overlapping segmentations. For overlapping crystals, however, a limitation of the model was identified. To overcome this limitation, it is important to ensure that the samples to be analyzed are properly prepared. This will minimize crystal overlap in the SEM image acquisition and guarantee a lower error in the measurements without greater efforts for data handling. All in all, the method developed is a time optimizer with a high measurement value, considering that it is capable of measuring hundreds of graphene oxide crystals in seconds, saving weeks of manual work.
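
As a hedged illustration of the measurement stage described above (the U-net training itself is omitted), the sketch below labels a binary segmentation mask and extracts position, area, and perimeter per crystal with scikit-image, mirroring the abstract's construction of a database of crystal dimensions. The synthetic mask and the assumed pixel size are placeholders, not values from the paper.

```python
# Illustrative sketch of the post-segmentation measurement step (not the authors' code).
import numpy as np
import pandas as pd
from skimage import measure

rng = np.random.default_rng(1)
mask = np.zeros((256, 256), dtype=bool)
for _ in range(10):                      # fake "crystals" as random rectangles
    r, c = rng.integers(0, 200, size=2)
    h, w = rng.integers(10, 40, size=2)
    mask[r:r + h, c:c + w] = True

labels = measure.label(mask)             # connected-component labelling (object delimitation)
props = measure.regionprops_table(
    labels, properties=("label", "centroid", "area", "perimeter"))
df = pd.DataFrame(props)

NM_PER_PIXEL = 5.0                       # assumed SEM pixel size, for illustration only
df["area_nm2"] = df["area"] * NM_PER_PIXEL ** 2
df["perimeter_nm"] = df["perimeter"] * NM_PER_PIXEL
print(df.head())
# A histogram of df["area_nm2"] reproduces the kind of size-frequency
# distribution the methodology generates automatically.
```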

Keywords: characterization, graphene oxide, nanomaterials, U-net, deep learning

Procedia PDF Downloads 150
1196 Characterization of Practices among Pig Smallholders in Cambodia and Implications for Disease Risk

Authors: Phalla Miech, William Leung, Ty Chhay, Sina Vor, Arata Hidano

Abstract:

Smallholder pig farms (SPFs) are prevalent in Cambodia but are vulnerable to disease impacts, as evidenced by the recent incursion of African swine fever into the region. As part of the ‘PigFluCam+’ project, we sought to provide an updated picture of pig husbandry and biosecurity practices among SPFs in south-central Cambodia. A multi-stage sampling design was adopted to select study districts and villages within four provinces: Phnom Penh, Kandal, Takeo, and Kampong Speu. Structured interviews were conducted between October 2020 and May 2021 among all consenting households keeping pigs in 16 target villages. Recruited SPFs (n=176) kept 6.8 pigs on average (s.d.=7.7), with most (88%) keeping cross-bred varieties of sows (77%), growers/finishers (39%), piglets/weaners (22%), and few keeping boars (5%). Chickens (83%) and waterfowl (56%) were commonly raised and could usually contact pigs directly (79%). Pigs were the primary source of household income for 28% of participants. While pigs tended to be housed individually (40%) or in groups (33%), 13% kept pigs free-ranging/tethered. Pigs were commonly fed agricultural by-products (80%), commercial feed (60%), and, notably, household waste (59%). Under half of SPFs vaccinated their pigs (e.g., against classical swine fever, Aujeszky’s disease, and pasteurellosis), although the target disease was often unknown. Among 20 SPFs who experienced pig morbidities/mortalities within the past 6 months, only 3 (15%) reported to animal health workers, and disease etiology was rarely known. Common biosecurity measures included nets covering pig pens (62%) and restricting access to the site/pens (46%). Boot dips (0.6%) and PPE (1.2%) were rarely used. Pig smallholdings remain an important contributor to rural livelihoods. Current practices and biosecurity challenges increase risk pathways for a range of disease threats of both local and global concern. Ethnographic studies are needed to better understand local determinants and develop context-appropriate strategies.

Keywords: smallholder production, swine, biosecurity practices, Cambodia, African swine fever

Procedia PDF Downloads 164
1195 Efficacy of Botulinum Toxin in Alleviating Pain Syndrome in Stroke Patients with Upper Limb Spasticity

Authors: Akulov M. A., Zaharov V. O., Jurishhev P. E., Tomskij A. A.

Abstract:

Introduction: Spasticity is a severe consequence of stroke, leading to profound disability, decreased quality of life and reduced rehabilitation efficacy [4]. Spasticity is often associated with pain syndrome, arising from joint damage of paretic limbs (postural arthropathy) or painful spasm of paretic limb muscles. It is generally accepted that injection of botulinum toxin into a spastic muscle leads to a decrease of muscle tone and improves the range of motion in the paretic limb, which is accompanied by pain alleviation. Study aim: To evaluate the change in pain syndrome intensity after injections of botulinum toxin A (Xeomin) in stroke patients with upper limb spasticity. Patients and methods: 21 patients aged 47-74 years were evaluated. Inclusion criteria were: acute stroke 4-7 months before inclusion into the study, leading to spasticity of the wrist and/or finger flexors, elbow flexor or forearm pronator, associated with severe pain syndrome. Patients received Xeomin as monotherapy at 90-300 U, according to the spasticity pattern. Efficacy evaluation was performed using the Ashworth scale, the Disability Assessment Scale (DAS), the caregiver burden scale and a global treatment benefit assessment at weeks 2, 4, 8 and 12. The efficacy criterion was the decrease of pain syndrome by week 4 on the PQLS and VAS. Results: The study revealed a significant improvement of the measured indices after 4 weeks of treatment, which persisted until the 12th week of treatment. Xeomin is effective in reducing the muscle tone of the wrist, finger and elbow flexors and the forearm pronators. By the 4th week of treatment we observed a significant improvement on the DAS (p < 0.05), the Ashworth scale (1-2 points) in all patients (p < 0.05), and the caregiver burden scale (p < 0.05). A significant decrease of pain syndrome by the 4th week of treatment on the PQLS (p < 0.05) and VAS (p < 0.05) was observed. No adverse effects were registered. Conclusion: Xeomin is an effective treatment of pain syndrome in postural upper limb spasticity after stroke. Xeomin treatment leads to a significant improvement on the PQLS and VAS.

Keywords: botulinum toxin, pain syndrome, spasticity, stroke

Procedia PDF Downloads 298
1194 Content-Aware Image Augmentation for Medical Imaging Applications

Authors: Filip Rusak, Yulia Arzhaeva, Dadong Wang

Abstract:

Machine learning based Computer-Aided Diagnosis (CAD) is gaining much popularity in medical imaging and diagnostic radiology. However, it requires a large amount of high-quality, labeled training image datasets. The training images may come from different sources and be acquired from different radiography machines produced by different manufacturers, or be digital or digitized copies of film radiographs, with various sizes as well as different pixel intensity distributions. In this paper, a content-aware image augmentation method is presented to deal with these variations. The results of the proposed method have been validated graphically by plotting the removed and added seams of pixels on original images. Two different chest X-ray (CXR) datasets are used in the experiments. The CXRs in the datasets differ in size; some are digital CXRs while the others are digitized from analog CXR films. With the proposed content-aware augmentation method, the Seam Carving algorithm is employed to resize CXRs and the corresponding labels in the form of image masks, followed by histogram matching used to normalize the pixel intensities of digital radiographs based on the pixel intensity values of digitized radiographs. We implemented the algorithms, resized the well-known Montgomery dataset to the size of the most frequently used Japanese Society of Radiological Technology (JSRT) dataset, and normalized our digital CXRs for testing. This work resulted in a unified off-the-shelf CXR dataset composed of radiographs included in both the Montgomery and JSRT datasets. The experimental results show that even though the amount of augmentation is large, our algorithm can preserve the important information in lung fields, local structures, and the global visual effect adequately. The proposed method can be used to augment training and testing image datasets so that the trained machine learning model can be used to process CXRs from various sources, and it can potentially be used broadly in any medical imaging application.
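
A minimal sketch of the intensity-normalisation step described above, not the authors' implementation: the content-aware seam-carving resize is replaced here by a plain resize (current scikit-image releases do not ship a seam-carving routine), and the stand-in images and their shapes are assumptions.

```python
# Hedged sketch: resize a digital CXR to a reference shape, then histogram-match it
# to a digitized film CXR, as the abstract describes. Synthetic images only.
import numpy as np
from skimage import exposure, transform

rng = np.random.default_rng(42)
digital_cxr = rng.normal(0.6, 0.10, size=(3000, 2400)).clip(0, 1)    # stand-in digital CXR
digitized_ref = rng.normal(0.4, 0.20, size=(2048, 2048)).clip(0, 1)  # stand-in digitized film CXR

# 1) Bring the image to the reference (e.g., JSRT-sized) resolution.
#    A plain resize stands in for the content-aware seam-carving step in the paper.
resized = transform.resize(digital_cxr, digitized_ref.shape, anti_aliasing=True)

# 2) Match the pixel-intensity distribution of the digital CXR to the digitized one.
normalized = exposure.match_histograms(resized, digitized_ref)
print(normalized.shape, round(float(normalized.mean()), 3))
```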

Keywords: computer-aided diagnosis, image augmentation, lung segmentation, medical imaging, seam carving

Procedia PDF Downloads 201
1193 Robust Inference with a Skew T Distribution

Authors: M. Qamarul Islam, Ergun Dogan, Mehmet Yazici

Abstract:

There is a growing body of evidence that non-normal data are more prevalent in nature than normal data. Examples can be quoted from, but are not restricted to, the areas of Economics, Finance and Actuarial Science. The non-normality considered here is expressed in terms of the fat-tailedness and asymmetry of the relevant distribution. In this study, a skew t distribution that can be used to model data that exhibit inherent non-normal behavior is considered. This distribution has tails fatter than a normal distribution and it also exhibits skewness. Although maximum likelihood estimates can be obtained by iteratively solving the likelihood equations, which are non-linear in form, this can be problematic in terms of convergence and in many other respects as well. Therefore, it is preferred to use the method of modified maximum likelihood, in which the likelihood estimates are derived by expressing the intractable non-linear likelihood equations in terms of standardized ordered variates and replacing the intractable terms by their linear approximations obtained from the first two terms of a Taylor series expansion about the quantiles of the distribution. These estimates, called modified maximum likelihood estimates, are obtained in closed form. Hence, they are easy to compute and to manipulate analytically. In fact, the modified maximum likelihood estimates are asymptotically equivalent to maximum likelihood estimates. Even in small samples the modified maximum likelihood estimates are found to be approximately the same as maximum likelihood estimates that are obtained iteratively. It is shown in this study that the modified maximum likelihood estimates are not only unbiased but substantially more efficient than the commonly used moment estimates or the least square estimates that are known to be biased and inefficient in such cases. Furthermore, in conventional regression analysis, it is assumed that the error terms are distributed normally and, hence, the well-known least square method is considered to be a suitable and preferred method for making the relevant statistical inferences. However, a number of empirical studies have shown that non-normal errors are more prevalent. Even transforming and/or filtering techniques may not produce normally distributed residuals. Here, a study is done for multiple linear regression models with random errors that follow a non-normal pattern. Through an extensive simulation it is shown that the modified maximum likelihood estimates of regression parameters are plausibly robust to the distributional assumptions and to various data anomalies as compared to the widely used least square estimates. Relevant tests of hypothesis are developed and are explored for desirable properties in terms of their size and power. The tests based upon modified maximum likelihood estimates are found to be substantially more powerful than the tests based upon least square estimates. Several examples are provided from the areas of Economics and Finance where such distributions are interpretable in terms of the efficient market hypothesis with respect to asset pricing, portfolio selection, risk measurement and capital allocation, etc.
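
For readers unfamiliar with the modified maximum likelihood device described above, the following is a generic sketch in standard location-scale notation; it is not taken from the paper, and the symbols g, t(i), alpha_i and beta_i are the conventional ones rather than the authors'.

```latex
% Likelihood equation for the location parameter of a location-scale family
% f((y - \mu)/\sigma), written in terms of standardized ordered variates:
\frac{\partial \ln L}{\partial \mu}
  = -\frac{1}{\sigma} \sum_{i=1}^{n} g\!\left(z_{(i)}\right) = 0,
\qquad z_{(i)} = \frac{y_{(i)} - \mu}{\sigma}, \qquad g(z) = \frac{f'(z)}{f(z)}.
% The intractable term g(z_{(i)}) is replaced by its first-order Taylor expansion
% about t_{(i)}, the population quantile (or expected value) of z_{(i)}:
g\!\left(z_{(i)}\right) \simeq \alpha_i + \beta_i\, z_{(i)},
\qquad \beta_i = g'\!\left(t_{(i)}\right),
\qquad \alpha_i = g\!\left(t_{(i)}\right) - \beta_i\, t_{(i)}.
```

With this substitution the likelihood equations become linear in the parameters, which is what yields the closed-form modified maximum likelihood estimators referred to in the abstract.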

Keywords: least square estimates, linear regression, maximum likelihood estimates, modified maximum likelihood method, non-normality, robustness

Procedia PDF Downloads 390
1192 Examining the Current Divisive State of American Political Discourse through the Lens of Peirce's Triadic Logical Structure and Pragmatist Metaphysics

Authors: Nathan Garcia

Abstract:

The polarizing dialogue of contemporary political America results from core philosophical differences. But these differences go beyond ideology and reach the level of metaphysical distinction. Good intellectual historians have theorized that fundamental concepts such as freedom, God, and nature have been sterilized of their intellectual vigor. They are partially correct. The 19th-century pragmatist Charles Sanders Peirce offers a penetrating philosophy which can yield greater insight into the contemporary political divide. Peirce argues that metaphysical and ethical issues are derivative of operational logic. His triadic logical structure, and the metaphysical principles constructed therefrom, is contemporaneously applicable for three reasons. First, Peirce’s logic aptly scrutinizes the logical processes of liberal and conservative mindsets. Each group arrives at a cosmological root metaphor (abduction), resulting in a contemporary assessment (deduction), ultimately prompting attempts to verify the original abduction (induction). Peirce’s system demonstrates that liberal citizens develop a cosmological root metaphor in the concept of fairness (abduction), resulting in a contemporary assessment of, for example, underrepresented communities being unfairly preyed upon (deduction), thereby inciting anger toward traditional socio-political structures suspected of purposefully destabilizing minority communities (induction). Similarly, conservative citizens develop a cosmological root metaphor in the concept of freedom (abduction), resulting in a contemporary assessment of, for example, liberal citizens advocating an expansion of governmental powers (deduction), thereby inciting anger towards liberal communities suspected of attacking the freedoms of ordinary Americans in a bid to empower their interests through the government (induction). The value of this triadic assessment is the categorization of distinct types of inferential logic by their purpose and boundaries. Only deductive claims can be concretely proven, while abductive claims are merely preliminary hypotheses, and inductive claims are accountable to interdisciplinary oversight. Liberal and conservative logical processes preclude constructive dialogue because of (a) an unshared abductive framework, and (b) misunderstanding of the rules and responsibilities of their types of claims. Second, Peircean metaphysical principles offer a better summary of the contemporaneously divisive political climate. His insights can weed through the partisan theorizing to unravel the underlying philosophical problems. Corrosive nominalistic and essentialistic presuppositions weaken the ability to share experiences and communicate effectively, both requisite for any promising constructive dialogue. Peirce’s pragmatist system can expose and evade fallacious thinking in pursuit of a refreshing alternative framework. Finally, Peirce’s metaphysical foundation enables a logically coherent, scientifically informed orthopraxis well-suited for American dialogue. His logical structure necessitates a radically different anthropology conducive to shared experiences and dialogue within a dynamic, cultural continuum. Peirce’s fallibilism and sensitivity to religious sentiment successfully navigate between liberal and conservative values. In sum, he provides a normative paradigm for intranational dialogue that privileges individual experience and values morally defensible notions of freedom, God, and nature. Utilizing Peirce’s thought will yield fruitful analysis and offers a promising philosophical alternative for framing and engaging in contemporary American political discourse.

Keywords: Charles S. Peirce, American politics, logic, pragmatism

Procedia PDF Downloads 102
1189 Empirical Analysis of a Global Impact of Consumer Privacy and Protection Laws, Electronic Transaction Laws, Privacy and Data Protection Laws, and Cybercrime Legislation on Cyber Attacks and Malware Types: Problems and Prospects

Authors: Essang Anwana Onuntuei, Chinyere Blessing Azunwoke

Abstract:

The study aimed to probe how well cyber law operates worldwide and then draw a logical conclusion on Nigeria’s experience using a deductive reasoning approach. With a purposive (structured) sampling technique, seventy-eight countries (thirteen countries from each of the six continents of the world) were selected as the sample. The methods used for analysing the data include Analysis of Variance (ANOVA), Pearson product-moment correlation and regression analysis, and multiple regression analysis. At a 0.05 significance level (two-tailed), the findings established that the calculated F value of about 23.74 exceeded the critical value of 2.23, indicating that total cyber-attacks and malware types vary significantly. Also, at a 0.05 significance level (two-tailed), the calculated F value of 0.75 was below the critical value of 1.7 (P-value = 0.73), indicating that cybercrime legislation does not vary significantly. More so, the calculated t value of 7.305 was below the critical (table) value of 12.05 at the 0.05 level (two-tailed), implying that electronic transaction law does not statistically impact the total number of cyber-attacks. The results also showed that Consumer Privacy and Protection law does not statistically impact the total number of cyber-attacks, as the calculated t value of 6.21 was below the critical value of 20.82 at the 0.05 level (two-tailed). In addition, the calculated t value of 7.97 was below the critical value of 14.76 at the 0.05 level (two-tailed), implying that Privacy and Data Protection law does not statistically impact the total number of cyber-attacks worldwide. The calculated t value of 5.75 was below the critical value of 12.65 at the 0.05 level (two-tailed), indicating that cybercrime law does not statistically impact the total number of cyber-attacks. Finally, the calculated t value of 6.21 was below the critical value of 20.82 at the 0.05 level (two-tailed), leading to the conclusion that the combined multiple cyber laws do not significantly impact the total number of cyber-attacks worldwide. Recommendations were made based on the findings of the study.
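
The sketch below is a generic illustration of the analysis pipeline named in this abstract (one-way ANOVA across groups plus a multiple regression of attack counts on law indicators); it does not reproduce the authors' data or results, and the column names, synthetic values, and the use of scipy/statsmodels are assumptions.

```python
# Hedged sketch of an ANOVA-plus-multiple-regression workflow on synthetic data.
import numpy as np
import pandas as pd
from scipy import stats
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 78  # thirteen countries from each of six continents
df = pd.DataFrame({
    "continent": np.repeat(["Africa", "Asia", "Europe", "N_America", "S_America", "Oceania"], 13),
    "cyber_attacks": rng.poisson(50, n),
    "consumer_privacy_law": rng.integers(0, 2, n),   # 1 = law in force, 0 = absent (assumed coding)
    "e_transaction_law": rng.integers(0, 2, n),
    "data_protection_law": rng.integers(0, 2, n),
    "cybercrime_law": rng.integers(0, 2, n),
})

# One-way ANOVA: do attack counts differ across continents?
groups = [g["cyber_attacks"].values for _, g in df.groupby("continent")]
f_stat, p_val = stats.f_oneway(*groups)
print(f"ANOVA: F = {f_stat:.2f}, p = {p_val:.3f}")

# Multiple regression: combined effect of the four bodies of law on attack counts
X = sm.add_constant(df[["consumer_privacy_law", "e_transaction_law",
                        "data_protection_law", "cybercrime_law"]])
model = sm.OLS(df["cyber_attacks"], X).fit()
print(model.summary())
```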

Keywords: cybercrime legislation, cyber attacks, consumer privacy and protection law, detection, electronic transaction law, prevention, privacy and data protection law, prohibition, prosecution

Procedia PDF Downloads 8
1190 Climate Species Lists: A Combination of Methods for Urban Areas

Authors: Andrea Gion Saluz, Tal Hertig, Axel Heinrich, Stefan Stevanovic

Abstract:

Higher temperatures, seasonal changes in precipitation, and extreme weather events are increasingly affecting trees. To counteract the growing challenges facing urban trees, strategies are being sought both to preserve existing tree populations and to prepare for the coming years. One such strategy lies in climate-oriented tree species selection: the search is on for species or varieties that can cope with the new climatic conditions. Many efforts in German-speaking countries deal with this in detail, such as the tree lists of the German Conference of Garden Authorities (GALK), the project Stadtgrün 2021, or the instruments of the Climate Species Matrix by Prof. Dr. Roloff. In this context, different methods for a correct species selection are offered. One possibility is to select certain physiological attributes that indicate the climate resilience of a species. To calculate the dissimilarity of the present climate of different geographic regions in relation to the future climate of any city, a weighted (standardized) Euclidean distance (SED) for seasonal climate values is calculated for each region of the Earth. The calculation was performed in the QGIS geographic information system, using global raster datasets of monthly climate values for the 1981-2010 standard period. Data from a European forest inventory were used to identify tree species growing in the calculated analogue climate regions. The inventory used is a compilation of georeferenced point data at a 1 km grid resolution on the occurrence of tree species in 21 European countries. In this project, the results of the methodological application are shown for the city of Zurich for the year 2060. In the first step, analogue climate regions were identified based on projected climate values for the measuring station Kirche Fluntern (ZH). In a further step, the methods mentioned above were applied to generate tree species lists for the city of Zurich. These lists were then qualitatively evaluated with respect to the suitability of the different tree species for the Zurich area, to generate a cleaned and thus usable list of possible future tree species.
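
A minimal sketch of the analogue-climate search described above, not the project's code: a weighted, standardized Euclidean distance between a target station's projected seasonal climate and the 1981-2010 climate of every candidate grid cell. The variable layout, the equal weights, and the synthetic values are assumptions.

```python
# Hedged sketch of a weighted standardized Euclidean distance (SED) analogue search.
import numpy as np

# Assumed seasonal climate vector layout: four seasonal mean temperatures, then four seasonal precipitation sums.
rng = np.random.default_rng(7)
scale = np.array([5, 5, 5, 5, 60, 60, 60, 60], dtype=float)
offset = np.array([2, 10, 20, 11, 150, 200, 250, 180], dtype=float)
candidate_cells = rng.normal(size=(100_000, 8)) * scale + offset      # stand-in global raster cells
target_2060 = np.array([4.5, 13.0, 24.0, 13.5, 140, 190, 230, 170])   # assumed projected Zurich values

std = candidate_cells.std(axis=0)   # standardization per climate variable
weights = np.ones(8)                # equal weights here; temperature could be up-weighted

sed = np.sqrt((weights * ((candidate_cells - target_2060) / std) ** 2).sum(axis=1))
best = np.argsort(sed)[:100]        # the 100 most climate-analogous grid cells
print(best[:10], np.round(sed[best[:10]], 2))
# Tree species recorded in these analogue cells (from the European forest inventory)
# would then form the candidate climate species list for Zurich 2060.
```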

Keywords: climate change, climate region, climate tree, urban tree

Procedia PDF Downloads 88
1190 Motivation of Doctors and Its Impact on the Quality of Working Life

Authors: E. V. Fakhrutdinova, K. R. Maksimova, P. B. Chursin

Abstract:

At the present stage of societal progress, health care is an integral part of both the economic and the social system; in the latter case, medicine is a major component of a number of basic and necessary social programs. Since the foundation of the health system is its highly qualified health professionals, it is a logical proposition that increasing doctors' professionalism improves the effectiveness of the system as a whole. The professionalism of a doctor is a collection of many components, with an essential role played by such personal-psychological factors as honesty, willingness and desire to help people, and motivation. A number of researchers consider motivation to be an expression of basic human needs that have passed through the 'filter' of a worldview and values learned by the individual in the process of socialization, prompting certain actions designed to achieve an expected result. From this point of view, a number of researchers propose the following classification of highly skilled employees' needs: 1. the need for confirmation of competence (setting goals that match one's professionalism and deriving positive emotions from achieving them); 2. the need for independence (the ability to make one's own choices in contentious situations that arise while carrying out specialist functions); 3. the need for ownership (in the case of health care workers, belonging to the profession and, accordingly, enjoying the doctor's high status in the eyes of the public). Nevertheless, it is important to understand that in a market economy a significant motivator for physicians (both legal and natural persons) is to maximize their own profits. In the case of health professionals, this dual motivational structure creates an additional contrast, since in the public mind the image of the ideal physician is usually that of an altruistically minded person who thinks not primarily about their own benefit but about assisting others. In this context, the question of the real motivation of health workers deserves special attention. A survey conducted by the American researcher Harrison Terni for the magazine 'Med Tech' in 2010 gathered the opinions of more than 200 medical students starting their courses: the primary motivation in choosing the profession was the 'desire to help people', and only 15% said that they wanted to become a doctor 'to earn a lot'. From the point of view of most of the classical theories of motivation, this trend can be called positive, as intangible incentives are more effective. However, it is likely that over time the opinion of the respondents may change in the direction of mercantile motives. Thus, it is logical to assume that a well-designed system of motivation of doctors' labor should be based on motivational foundations laid during training in higher education.

Keywords: motivation, quality of working life, health system, personal-psychological factors, motivational structure

Procedia PDF Downloads 345
1188 Are Socio-Economic Characteristics Associated with Health-Related Quality of Life among the Elderly: Evidence from SAGE Data in India

Authors: Mili Dutta, Lokender Prashad

Abstract:

Introduction: Population ageing is a phenomenon that can be observed around the globe. Health-related quality of life (HRQOL) is a measurement of the health status of an individual, and it describes the effect of physical and mental health disorders on the well-being of a person. The present study aims to describe the influence of the socio-economic characteristics of the elderly on their health-related quality of life in India. Methods: The EQ-5D instrument and a population-based EQ-5D index score were used to assess HRQOL among the elderly. The present study utilized data from the Study on Global Ageing and Adult Health (SAGE), which was conducted in India in 2007. A multiple logistic regression model and a multivariate linear regression model were employed. Results: The present study found that females are more likely to have problems with mobility (OR=1.41, 95% CI: 1.14 to 1.74), self-care (OR=1.26, 95% CI: 1.01 to 1.56) and pain or discomfort (OR=1.50, 95% CI: 1.16 to 1.94). Elderly people residing in rural areas are more likely to have problems with pain/discomfort (OR=1.28, 95% CI: 1.01 to 1.62). Older and non-working elderly people are more likely, whereas higher-educated elderly and those in the highest wealth quintile are less likely, to have problems in all the dimensions of the EQ-5D, viz. mobility, self-care, usual activity, pain/discomfort and anxiety/depression. The present study has also shown that the oldest old, those residing in rural areas, and the currently non-working elderly are more likely to report a low EQ-5D index score, whereas elderly people with a high education level and in a high wealth quintile are more likely to report a high EQ-5D index score than their counterparts. Conclusion: The present study found the EQ-5D instrument to be a valid measure for assessing the HRQOL of the elderly in India. The study identifies socio-economic characteristics of the elderly, namely being female, older, residing in a rural area, non-educated, poor, and currently non-working, as markers of the major risk groups for poor HRQOL in India. The findings of the study will be helpful for programmes and policy makers, researchers, academicians and social workers who are working in the field of ageing.
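
As an illustrative sketch of the type of model described in this abstract (not the SAGE analysis itself), the code below fits a logistic regression of reporting any mobility problem on socio-economic covariates and reports odds ratios with 95% confidence intervals. The variable names and the synthetic data are assumptions.

```python
# Hedged sketch of an EQ-5D-style logistic regression on synthetic data.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(3)
n = 5000
df = pd.DataFrame({
    "mobility_problem": rng.integers(0, 2, n),
    "female": rng.integers(0, 2, n),
    "rural": rng.integers(0, 2, n),
    "age_group": rng.choice(["60-69", "70-79", "80+"], n),
    "education": rng.choice(["none", "primary", "secondary+"], n),
    "wealth_quintile": rng.integers(1, 6, n),
    "working": rng.integers(0, 2, n),
})

model = smf.logit(
    "mobility_problem ~ female + rural + C(age_group) + C(education)"
    " + wealth_quintile + working", data=df).fit(disp=False)

odds_ratios = np.exp(model.params)
ci = np.exp(model.conf_int())
print(pd.concat([odds_ratios.rename("OR"),
                 ci.rename(columns={0: "2.5%", 1: "97.5%"})], axis=1))
```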

Keywords: ageing, HRQOL, India, EQ-5D, SAGE, socio-economic characteristics

Procedia PDF Downloads 390
1187 Prioritizing Biodiversity Conservation Areas based on the Vulnerability and the Irreplaceability Framework in Mexico

Authors: Alma Mendoza-Ponce, Rogelio Corona-Núñez, Florian Kraxner

Abstract:

Mexico is a megadiverse country, and it has nearly halved its natural vegetation in the last century due to agricultural and livestock expansion. The impacts of land use cover change and climate change are unevenly distributed, and spatial prioritization to minimize the impacts on biodiversity is crucial. Global and national efforts for prioritizing biodiversity conservation show that ~33% to 45% of Mexico should be protected. The breadth of these targets makes it difficult to direct resources. We use a framework based on vulnerability and irreplaceability to prioritize conservation efforts in Mexico. Vulnerability considered exposure, sensitivity and adaptive capacity under two scenarios (a business-as-usual (BAU) scenario, based on SSP2 and RCP 4.5, and a Green scenario, based on SSP1 and RCP 2.6). Exposure to land use is the magnitude of change from natural vegetation to anthropogenic covers, while exposure to climate change is the difference between current and future values for both scenarios. Sensitivity was considered as the number of endemic species of terrestrial vertebrates which are critically endangered and endangered. Adaptive capacity is used as the ratio between the percentage of converted area (natural to anthropogenic) and the percentage of protected area at the municipality level. The results suggest that by 2050, between 11.6% and 13.9% of Mexico will show vulnerability ≥ 50%, and by 2070, between 12.0% and 14.8%, in the Green and BAU scenarios, respectively. From an ecosystem perspective, cloud forests, followed by tropical dry forests, natural grasslands and temperate forests, will be the most vulnerable (≥ 50%). Amphibians are the most threatened vertebrates; 62% of the endemic amphibians are critically endangered or endangered, compared with 39%, 12% and 9% of the mammals, birds, and reptiles, respectively. However, the distribution of these amphibians covers only 3.3% of the country, while the mammals, birds, and reptiles in these categories cover 10%, 16% and 29% of Mexico. Five municipalities, out of the 2,457 that Mexico has, contain 31% of the most vulnerable areas (70%); these municipalities account for 0.05% of Mexico. This multiscale approach can be used to direct resources to conservation targets such as ecosystems, municipalities or species, considering land use cover change, climate change and biodiversity uniqueness.
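
The sketch below shows one way the three vulnerability components named in this abstract could be combined per municipality. The abstract does not give the aggregation formula, so the min-max normalisation and the simple average used here are assumptions, as are the column names and the toy values.

```python
# Hedged sketch of an exposure/sensitivity/adaptive-capacity aggregation (assumed formula).
import pandas as pd

df = pd.DataFrame({
    "exposure_luc": [0.10, 0.45, 0.80],   # fraction changed from natural to anthropogenic cover
    "exposure_cc": [0.8, 1.5, 2.3],       # future-minus-current climate difference (toy values)
    "sensitivity": [2, 15, 40],           # endemic CR/EN terrestrial vertebrate species
    "pct_converted": [12.0, 55.0, 85.0],  # % of municipality already converted
    "pct_protected": [30.0, 10.0, 2.0],   # % of municipality under protection
})

def minmax(s):
    return (s - s.min()) / (s.max() - s.min())

# Adaptive capacity from the conversion-to-protection ratio described in the abstract:
# a higher ratio is taken to mean lower adaptive capacity (interpretation, not stated formula).
adaptive_capacity = 1.0 - minmax(df["pct_converted"] / df["pct_protected"])

exposure = 0.5 * minmax(df["exposure_luc"]) + 0.5 * minmax(df["exposure_cc"])
vulnerability = (minmax(exposure) + minmax(df["sensitivity"]) + (1.0 - adaptive_capacity)) / 3.0
df["vulnerability_pct"] = 100.0 * vulnerability
print(df[["vulnerability_pct"]])
```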

Keywords: biodiversity, climate change, land use change, Mexico, vulnerability

Procedia PDF Downloads 153
1186 Renewable Energy Integration in Cities of Developing Countries: The Case Study of Tema City, Ghana

Authors: Marriette Sakah, Christoph Kuhn, Samuel Gyamfi

Abstract:

Global household electricity demand in 2005 is estimated to double by 2025 and nearly double again by 2030. The residential sector promises considerable demand growth through infrastructural and equipment investments, the majority of which is projected to occur in developing countries. This lays bare the urgency for enhanced efficiency in all energy systems, combined with exploitation of the local potential for renewable energy systems. This study explores options for reducing energy consumption, particularly in residential buildings, and for providing robust, decentralized and renewable energy supply for African cities. The potential of energy efficiency measures and the potential of harnessing local resources for renewable energy supply are quantitatively assessed. The scale of the research specifically addresses the city level, which is regulated by local authorities. Local authorities can actively promote the transition to a renewable-based energy supply system by promoting energy efficiency and the use of alternative renewable fuels in existing buildings, and particularly in the planning and development of new settlement areas, through the use of incentives, regulations, and demonstration projects. They can also support more sustainable development by shaping local land use and development patterns in ways that reduce per capita energy consumption and are benign to the environment. The subject of the current case study, Tema, is Ghana's main industrial hub, a port city and home to 77,000 families. Residential buildings in Tema consumed 112 GWh of electricity in 2013, or 1.45 MWh per household. If average household electricity demand were to decline at an annual rate of just 2%, by 2035 Tema would consume only 134 GWh of electricity despite an expected increase in the number of households by 84%. The work is based on a ground survey of the city's residential sector. The results show that efficient technologies and decentralized renewable energy systems have great potential for meeting the rapidly growing energy demand of cities in developing countries.
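
A back-of-the-envelope check of the projection quoted above, under assumptions not stated in the abstract (compounding over 21 annual steps from 2014 to 2035 and applying the 84% household increase to the 2013 baseline), so the result is indicative rather than exact.

```python
# Hedged arithmetic sketch of the Tema 2035 projection (assumed year-counting convention).
baseline_2013_gwh = 112.0          # residential electricity use in Tema, 2013
household_growth = 1.84            # expected 84% increase in households by 2035
efficiency_decline = 0.02          # assumed annual decline in per-household demand
years = 2035 - 2014                # 21 annual steps (assumption)

projected_2035_gwh = baseline_2013_gwh * household_growth * (1 - efficiency_decline) ** years
print(f"Projected 2035 demand: {projected_2035_gwh:.1f} GWh")  # ~134-135 GWh, in line with the figure above
```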

Keywords: energy efficiency, energy saving potential, renewable energy integration, residential buildings, urban Africa

Procedia PDF Downloads 271
1185 Leading, Teaching and Learning “in the Middle”: Experiences, Beliefs, and Values of Instructional Leaders, Teachers, and Students in Finland, Germany, and Canada

Authors: Brandy Yee, Dianne Yee

Abstract:

Through the exploration of the lived experiences, beliefs and values of instructional leaders, teachers and students in Finland, Germany and Canada, we investigated the factors which contribute to developmentally responsive, intellectually engaging middle-level learning environments for early adolescents. Student-centred leadership dimensions, effective instructional practices and student agency were examined through the lens of current policy and research on middle-level learning environments emerging from the Canadian province of Manitoba. Consideration of these three research perspectives in the context of early adolescent learning, placed against an international backdrop, provided a previously undocumented perspective on leading, teaching and learning in the middle years. Aligning with a social constructivist, qualitative research paradigm, the study incorporated collective case study methodology, along with constructivist grounded theory methods of data analysis. Data were collected through semi-structured individual and focus group interviews and document review, as well as direct and participant observation. Three case study narratives were developed to share the rich stories of study participants, who had been selected using maximum variation and intensity sampling techniques. Interview transcript data were coded using processes from constructivist grounded theory. A cross-case analysis yielded a conceptual framework highlighting key factors that were found to be significant in the establishment of developmentally responsive, intellectually engaging middle-level learning environments. Seven core categories emerged from the cross-case analysis as common to all three countries. Within the visual conceptual framework (which depicts the interconnected nature of leading, teaching and learning in middle-level learning environments), these seven core categories were grouped into Essential Factors (student agency, voice and choice), Contextual Factors (instructional practices; school culture; engaging families and the community), Synergistic Factors (instructional leadership) and Cornerstone Factors (education as a fundamental cultural value; preservice, in-service and ongoing teacher development). In addition, sub-factors emerged from recurring codes in the data and identified specific characteristics and actions found in developmentally responsive, intellectually engaging middle-level learning environments. Although this study focused on 12 schools in Finland, Germany and Canada, it informs the practice of educators working with early adolescent learners in middle-level learning environments internationally. The authentic voices of early adolescent learners are the most important resource educators have to gauge if they are creating effective learning environments for their students. Ongoing professional dialogue and learning is essential to ensure teachers are supported in their work and develop the pedagogical practices needed to meet the needs of early adolescent learners. It is critical to balance consistency, coherence and dependability in the school environment with the necessary flexibility in order to support the unique learning needs of early adolescents. Educators must intentionally create a school culture that unites teachers, students and their families in support of a common purpose, as well as nurture positive relationships between the school and its community. 
A large, urban school district in Canada has implemented a school cohort-based model to begin to bring developmentally responsive, intellectually engaging middle-level learning environments to scale.

Keywords: developmentally responsive learning environments, early adolescents, middle level learning, middle years, instructional leadership, instructional practices, intellectually engaging learning environments, leadership dimensions, student agency

Procedia PDF Downloads 288
1184 Apathetic Place, Hostile Space: A Qualitative Study on the Ability of Immigration Detention in the UK to Promote the Health and Dignity of Detainees

Authors: P. Dhesi, R. Burns

Abstract:

Background: The UK has one of the largest immigration detention estates in Europe and is under increasing scrutiny, particularly regarding the lack of transparency over the use of detention and its conditions. Therefore, this research seeks to explore professional perceptions of the ability of immigration detention in the UK to promote health and dignity. Methods: A phenomenological approach to qualitative methods was used, with social constructivist theorisations of health and dignity. Seven semi-structured interviews were conducted using Microsoft Teams. Participants included a range of immigration detention stakeholders who have visited closed immigration detention centres in the UK in a professional capacity. Recorded interviews were transcribed verbatim, and analysis was data-driven through inductive reflexive thematic analysis of the entire data set to account for the small sample size. This study received ethical approval from the University College London Research Ethics Committee. Results: Two global themes were created through analysis: apathetic place and hostile space. Apathetic place discusses the lack of concern for detainees' daily living and healthcare needs within immigration detention in the UK. This is explored through participants' perceptions of the inability of monitoring and evaluation processes to ensure detainees are able to live with dignity, and through their understanding of the unfulfilled duty of care that exists in detention. Hostile space discusses immigration detention in the UK as a wider system of hostility. This is explored through the disempowering impact on detainees, the perception of a failing system as a result of inadequate safeguarding procedures, and a belief that the intention of immigration detention is misaligned with its described purpose. Conclusion: This research explains why the current immigration detention system in the UK is unable to promote health and dignity, offering a social justice and action-orientated approach to research in this sphere. The findings strengthen the discourse against the use of detention as an immigration control tool in the UK. Implications for further research include a stronger emphasis on investigating alternatives to detention and culturally considerate opportunities for patient-centred healthcare.

Keywords: access to healthcare, dignity, health, immigration detention, migrant, refugee, UK

Procedia PDF Downloads 87
1183 Sustainable Integrated Waste Management System

Authors: Lidia Lombardi

Abstract:

Waste management in Europe and North America is evolving towards sustainable materials management, intended as a systemic approach to using and reusing materials more productively over their entire life cycles. Various waste management strategies are prioritized and ranked from the most to the least environmentally preferred, placing emphasis on reducing, reusing, and recycling as key to sustainable materials management. However, non-recyclable materials must also be appropriately addressed, and waste-to-energy (WtE) offers a solution to manage them, especially when a WtE plant is integrated within a complex system of waste and wastewater treatment plants and potential users of the output flows. To evaluate the environmental effects of such system integration, Life Cycle Assessment (LCA) is a helpful and powerful tool. LCA has been applied extensively to the waste management sector, dating back to the late 1990s, producing a large number of theoretical studies as well as real-world applications in support of waste management planning. However, LCA still has a fundamental role in supporting decisions during the development of waste management systems. Thus, LCA was applied to evaluate the environmental performance of a Municipal Solid Waste (MSW) management system, with improved separate material collection and recycling and an integrated network of treatment plants including WtE, anaerobic digestion (AD) and a wastewater treatment plant (WWTP), for a reference case study area. The proposed system was compared to the actual situation, characterized by poor recycling, extensive landfilling and the absence of WtE. The LCA results showed that increased recycling significantly improves the environmental performance, but there is still room for improvement through the introduction of energy recovery (especially by WtE) and through its use within the system, for instance, by feeding the heat to the AD, to sludge recovery processes, and to support water reuse practices. WtE offers a solution to manage non-recyclable MSW and allows saving important resources (such as landfill volumes and non-renewable energy), reducing the contribution to global warming, and providing an essential contribution to fulfilling the goals of truly sustainable waste management.

Keywords: anaerobic digestion, life cycle assessment, waste-to-energy, municipal solid waste

Procedia PDF Downloads 48
1182 A Novel Method to Manufacture Superhydrophobic and Insulating Polyester Nanofibers via a Meso-Porous Aerogel Powder

Authors: Z. Mazrouei-Sebdani, A. Khoddami, H. Hadadzadeh, M. Zarrebini

Abstract:

Silica aerogels are well-known meso-porous materials with high specific surface area (500–1000 m²/g), high porosity (80–99.8%), and low density (0.003–0.8 g/cm³). However, silica aerogels are generally highly brittle due to their nanoporous nature. The physical and mechanical properties of silica aerogels can be enhanced by compounding them with fibers. Although some reports have presented incorporation of fibers into the sol, followed by further modification and drying stages, no information regarding aerogel powders as fillers in polymeric fibers is available. In this research, a waterglass-based aerogel powder was prepared in the following steps: a sol–gel process to prepare a gel, followed by subsequent washing with propan-2-ol, n-hexane, and TMCS, then ambient pressure drying, and ball milling. Inspired by its limited dust release, the aerogel powder was introduced into the PET electrospinning solution in an attempt to create the bulk and surface structure required for the nanofibers to improve their hydrophobic and insulation properties. Sample evaluation was carried out by measuring density, porosity, contact angle, sliding angle, and heat transfer, and by FTIR, BET, and SEM analyses. According to the results, a porous silica aerogel powder was fabricated with a mean pore diameter of 24 nm and a contact angle of 145.9°. The results indicated the usefulness of the aerogel powder confined in the nanofibers in controlling surface roughness to produce superhydrophobic nanowebs with a sliding angle of 5° and a water contact angle of 147°. This can be attributed to a multi-scale surface roughness created by the nanoweb structure itself and by the surface irregularity of the nanofibers in the presence of the aerogel, while a layer of fluorocarbon provided low surface energy. The wettability of a solid substrate is an important property that is controlled by both the chemical composition and the geometry of the surface. Also, a decreasing trend in heat transfer was observed, from 22% for the nanofibers without any aerogel powder to 8% for the nanofibers with 4% aerogel powder. The development of thermal insulating materials has become increasingly important in view of fossil energy depletion and global warming, which call for more demanding energy-saving practices.

Keywords: superhydrophobicity, insulation, sol-gel, surface energy, roughness

Procedia PDF Downloads 316
1181 Changes to Populations Might Aid the Spread of Antibiotic Resistance in the Environment

Authors: Yasir Bashawri, Vincent N. Chigor, James McDonald, Merfyn Williams, Davey Jones, A. Prysor Williams

Abstract:

Resistance to antibiotics has become a threat to public health. As a result of their misuse and overuse, bacteria have become resistant to many common antibiotics. Beta-lactam (β-lactam) antibiotics are one of the most significant classes of antimicrobials providing therapeutic benefits for the treatment of bacterial infections in both human and veterinary medicine, accounting for approximately 60% of all antibiotics used. In particular, some Enterobacteriaceae produce Extended-Spectrum Beta-Lactamases (ESBLs) that enable them to break down multiple groups of antibiotics. CTX-M enzymes have rapidly become the most important ESBLs, with increases mainly in CTX-M-15 in many countries during the last decade. Global travel by intercontinental medical ‘tourists’, migrant employees and overseas students could theoretically be a risk factor for spreading antibiotic resistance genes to different parts of the world. Bangor city, North Wales, is subject to sudden demographic changes due to a large proportion (>25%) of the population being students, most of whom arrive over a space of days. This makes it a suitable location to study the impacts of large demographic change on the presence of ESBLs. The aim of this study is to monitor the presence of ESBLs in Escherichia coli and faecal coliform bacteria isolated from Bangor wastewater treatment plant before, during and after the arrival week of students at Bangor University. Over a five-week period, water samples were collected twice a week from the influent, the primary sedimentation tank, the aeration tank and the final effluent. Isolation and counts of Escherichia coli and other faecal coliforms were done on selective agar (primary UTI agar). ESBL presence will be confirmed by phenotypic and genotypic methods. Sampling at all points of the tertiary treatment stages will indicate the effectiveness of wastewater treatment in reducing the spread of ESBL genes. The study will yield valuable information to help tackle a problem which many regard as one of the biggest threats to modern-day society.

Keywords: extended spectrum β-lactamase, enterobacteriaceae, international travel, wastewater treatment plant

Procedia PDF Downloads 356
1180 Resilience Compendium: Strategies to Reduce Communities' Risk to Disasters

Authors: Caroline Spencer, Suzanne Cross, Dudley McArdle, Frank Archer

Abstract:

Objectives: The evolution of the Victorian Compendium of Community-Based Resilience Building Case Studies and its capacity to help communities implement activities that encourage adaptation to disaster risk reduction and promote community resilience in rural and urban locations provide this paper's objectives. Background: Between 2012 and 2019, community groups presented at the Monash University Disaster Resilience Initiative (MUDRI) 'Advancing Community Resilience Annual Forums', provided opportunities for communities to impart local resilience activities, how to solve challenges and share unforeseen learning and be considered for inclusion in the Compendium. A key tenet of the Compendium encourages compiling and sharing of grass-roots resilience building activities to help communities before, during, and after unexpected emergencies. The online Compendium provides free access for anyone wanting to help communities build expertise, reduce program duplication, and save valuable community resources. Identifying case study features across the emergency phases and analyzing critical success factors helps communities understand what worked and what did not work to achieve success and avoid known barriers. International exemplars inform the Compendium, which represents an Australian first and enhances Victorian community resilience initiatives. Emergency Management Victoria provided seed funding for the Compendium. MUDRI matched this support and continues to fund the project. A joint Steering Committee with broad-based user input and Human ethics approval guides its continued growth. Methods: A thematic analysis of the Compendium identified case study features, including critical success factors. Results: The Compendium comprises 38 case studies, representing all eight Victorian regions. Case studies addressed emergency phases, before (29), during (7), and after (17) events. Case studies addressed all hazards (23), bushfires (11), heat (2), fire safety (1), and house fires (1). Twenty case studies used a framework. Thirty received funding, of which nine received less than $20,000 and five received more than $100,000. Twenty-nine addressed a whole of community perspective. Case studies revealed unique and valuable learning in diverse settings. Critical success factors included strong governance; board support, leadership, and trust; partnerships; commitment, adaptability, and stamina; community-led initiatives. Other success factors included a paid facilitator and local government support; external funding, and celebrating success. Anecdotally, we are aware that community groups reference Compendium and that its value adds to community resilience planning. Discussion: The Compendium offers an innovative contribution to resilience research and practice. It augments the seven resilience characteristics to strengthen and encourage communities as outlined in the Statewide Community Resilience Framework for Emergency Management; brings together people from across sectors to deliver distinct, yet connected actions to strengthen resilience as a part of the Rockefeller funded Resilient Melbourne Strategy, and supports communities and economies to be resilient when a shock occurs as identified in the recently published Australian National Disaster Risk Reduction Framework. Each case study offers learning about connecting with community and how to increase their resilience to disaster risks and to keep their community safe from unexpected emergencies. 
Conclusion: The Compendium enables diverse communities to adopt or adapt proven resilience activities, thereby preserving valuable community resources, and offers the opportunity to be extended into a national or international Compendium.

Keywords: case study, community, compendium, disaster risk reduction, resilience

Procedia PDF Downloads 106
1179 Violence against Children Surveys: Analysis of the Peer-Reviewed Literature from 2009-2019

Authors: Kathleen Cravero, Amanda Nace, Samantha Ski

Abstract:

The Violence Against Children Surveys (VACS) are nationally representative surveys of male and female youth ages 13-24, designed to measure the burden of sexual, physical, and emotional violence experienced in childhood and adolescence. As of 2019, 24 countries had implemented or were in the process of implementing a VACS, covering over ten percent of the world's child population. Since the first article using VACS data from Swaziland was published in 2009, several peer-reviewed articles have been published on the VACS; however, no publication to date has examined the breadth of this work or analyzed how the data are represented in the peer-reviewed literature. In this study, we conducted a literature review of all peer-reviewed research that used VACS data or discussed the implementation and methodology of the VACS. The literature review revealed several important findings. Between 2009 and July 2019, thirty-five peer-reviewed articles using VACS data from 12 countries were published. Twenty of the studies focus on one country, while 15 focus on two or more countries. Some countries are featured in the literature more than others, for example, Kenya (N=14), Malawi (N=12), and Tanzania (N=12). A review of the research by gender demonstrates that research on violence against boys is under-represented. Only two studies specifically focused on boys/young men, while 11 studies focused only on violence against girls, despite research suggesting that boys and girls experience similar rates of violence. A review of the publications by type of violence revealed significant differences in the types of violence featured in the literature: thirteen publications specifically focused on sexual violence, three on physical violence, and only one on emotional violence. Almost 70% of the peer-reviewed articles (24 of the 35) were first-authored by someone at the U.S. Centers for Disease Control and Prevention. There were very few first authors from VACS countries, which raises questions about who is leveraging the data and the extent to which capacities for data liberation are being developed within VACS countries. The VACS provide an unprecedented amount of information on the prevalence and past-year incidence of violence against children. Through a review of the peer-reviewed literature on the VACS, we can begin to identify trends and gaps in how the data are being used, as well as areas for further research.

Keywords: data to action, global health, implementation science, violence against children surveys

Procedia PDF Downloads 121
1178 Investigation and Comprehensive Benefit Analysis of 11 Typical Poplar-Based Agroforestry Models Based on Analytic Hierarchy Process in Anhui Province, Eastern China

Authors: Zhihua Cao, Hongfei Zhao, Zhongneng Wu

Abstract:

The development of poplar-based agroforestry is necessary given the timber market environment in China; it can promote the coordinated development of forestry and agriculture and yield remarkable ecological, economic, and social benefits. A survey of the main agroforestry models in the principal poplar planting areas of the Huaibei plain and the plain along the Yangtze River was carried out, and 11 typical poplar management models were selected: pure poplar forest, poplar-rape-soybean, poplar-wheat-soybean, poplar-rape-cotton, poplar-wheat, poplar-chicken, poplar-duck, poplar-sheep, poplar-Agaricus blazei, poplar-oil peony, and poplar-fish, represented by M0-M10, respectively. Twelve indexes related to economic, ecological, and social benefits (annual average cost, net income, ratio of output to investment, payback period of investment, land utilization ratio, utilization ratio of light energy, improvement and system stability of the ecological and production environment, product richness, labor capacity, cultural quality of the labor force, and sustainability) were screened to carry out a comprehensive evaluation and analysis of the 11 typical agroforestry models based on the analytic hierarchy process (AHP). The results showed that the economic benefit of each agroforestry model was in the order: M8 > M6 > M9 > M7 > M5 > M10 > M4 > M1 > M2 > M3 > M0. The economic benefit of the poplar-A. blazei model was the highest (332,800 RMB/hm²), followed by the poplar-duck and poplar-oil peony models (109,820 RMB/hm² and 57,226 RMB/hm², respectively). The order of comprehensive benefit was: M8 > M4 > M9 > M6 > M1 > M2 > M3 > M7 > M5 > M10 > M0. The economic benefit and comprehensive benefit of each agroforestry model were higher than those of the pure poplar forest. The comprehensive benefit of the poplar-A. blazei model was the highest, and that of the poplar-wheat model ranked second, although its economic benefit was not high. Next came the poplar-oil peony and poplar-duck models. It is suggested that the poplar-wheat model be adopted in the plain along the Yangtze River, and that the whole-cycle mode of poplar-grain, poplar-A. blazei, or poplar-oil peony be adopted in the Huaibei plain, northern Anhui. Furthermore, wheat, rape, and soybean are the main intercrops before the stand closes; agroforestry models based on edible fungi or Chinese herbal medicine can be adopted once the stand has closed in order to maximize the comprehensive benefit. The purpose of this paper is to provide a reference for forest farmers in selecting poplar agroforestry models in the future and to provide basic data for the sustainable and efficient study of poplar agroforestry in Anhui province, eastern China.
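
As a rough, self-contained illustration of the AHP weighting step described above (the 3x3 judgement matrix and criteria below are hypothetical, not the study's actual data), a minimal Python sketch might look like this:

```python
# Illustrative AHP weighting sketch (hypothetical 3x3 criterion matrix, not the paper's data).
import numpy as np

def ahp_weights(pairwise):
    """Return priority weights and consistency ratio for a pairwise comparison matrix."""
    A = np.asarray(pairwise, dtype=float)
    n = A.shape[0]
    eigvals, eigvecs = np.linalg.eig(A)
    k = np.argmax(eigvals.real)                  # principal eigenvalue
    w = np.abs(eigvecs[:, k].real)
    w /= w.sum()                                 # normalized priority weights
    lam_max = eigvals[k].real
    ci = (lam_max - n) / (n - 1)                 # consistency index
    ri = {1: 0.0, 2: 0.0, 3: 0.58, 4: 0.90, 5: 1.12}[n]  # Saaty's random index
    cr = ci / ri if ri else 0.0                  # consistency ratio (< 0.10 is acceptable)
    return w, cr

# Hypothetical judgements: economic vs. ecological vs. social benefit.
criteria = [[1, 3, 5],
            [1/3, 1, 2],
            [1/5, 1/2, 1]]
weights, cr = ahp_weights(criteria)
print("priority weights:", weights.round(3), "consistency ratio:", round(cr, 3))
```

In the study itself, weights of this kind would then be combined with the normalized scores of the 12 indexes to rank the 11 models M0-M10.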

Keywords: agroforestry, analytic hierarchy process (AHP), comprehensive benefit, model, poplar

Procedia PDF Downloads 147
1177 Isolation and Molecular Characterization of Lytic Bacteriophage against Carbapenem Resistant Klebsiella pneumoniae

Authors: Guna Raj Dhungana, Roshan Nepal, Apshara Parajuli, Archana Maharjan, Shyam K. Mishra, Pramod Aryal, Rajani Malla

Abstract:

Introduction: Klebsiella pneumoniae is a well-known opportunistic human pathogen, primarily causing healthcare-associated infections. The global emergence of carbapenemase-producing K. pneumoniae, which is often extensively multidrug resistant, is a major public health burden. Because of the difficulty of treating these 'superbugs', a menace some term the 'apocalypse' of the post-antibiotic era, an alternative approach to controlling this pathogen is prudent, and one such approach is phage-mediated control and/or treatment. Objective: In this study, we aimed to isolate novel bacteriophages against carbapenemase-producing K. pneumoniae and characterize them for potential use in phage therapy. Material and Methods: Twenty lytic phages were isolated from river water using the double-layer agar assay and purified. Biological features, physicochemical characteristics, burst size, host specificity, and activity spectrum of the phages were determined. The most potent phage, TU_Kle10O, was selected and characterized by electron microscopy. The whole-genome sequence of the phage was analyzed for the presence or absence of virulence factors and for lysin genes. Results: The novel phage TU_Kle10O showed a broad host range within its own genus and did not induce any bacteriophage-insensitive mutants (BIMs) up to the 5th generation of the host's life cycle. Electron microscopy confirmed that the phage was tailed and belonged to the order Caudovirales. Next-generation sequencing revealed its genome to be 166.2 kb. Bioinformatic analysis further confirmed that the phage genome did not contain any bacterial genes, which ruled out the concern of virulence gene transfer. A specific lysin enzyme, which could be used as an 'antibiotic', was identified in the phage. Conclusion: Extensively multidrug-resistant bacteria such as carbapenemase-producing K. pneumoniae could be treated efficiently by phages. The absence of virulence genes of bacterial origin and the presence of lysin proteins within the phage genome make phages excellent candidates for therapeutics.
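
As a loose illustration of the annotation screening mentioned above (the file name and keyword lists are hypothetical; real analyses typically query curated databases such as VFDB or CARD rather than matching product names), a minimal Biopython sketch might look like this:

```python
# Minimal annotation-screening sketch for a phage genome annotation in GenBank format.
# "TU_Kle10O.gb" is a hypothetical file name used only for illustration.
from Bio import SeqIO

SUSPECT = ("virulence", "toxin", "integrase", "antibiotic resistance")
LYSIN = ("lysin", "endolysin", "lysozyme", "holin")

def screen_phage_annotation(path):
    flagged, lysins = [], []
    for record in SeqIO.parse(path, "genbank"):
        for feature in record.features:
            if feature.type != "CDS":
                continue
            product = feature.qualifiers.get("product", ["hypothetical protein"])[0].lower()
            if any(word in product for word in SUSPECT):
                flagged.append(product)       # candidate genes of concern
            if any(word in product for word in LYSIN):
                lysins.append(product)        # candidate lytic enzymes
    return flagged, lysins

flagged, lysins = screen_phage_annotation("TU_Kle10O.gb")
print("possible problem genes:", flagged or "none found")
print("candidate lysins:", lysins or "none found")
```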

Keywords: bacteriophage, Klebsiella pneumoniae, MDR, phage therapy, carbapenemase

Procedia PDF Downloads 175
1176 Investigation of Turbulent Flow in a Bubble Column Photobioreactor and Consequent Effects on Microalgae Cultivation Using Computational Fluid Dynamic Simulation

Authors: Geetanjali Yadav, Arpit Mishra, Parthsarathi Ghosh, Ramkrishna Sen

Abstract:

The world is facing the problems of increasing global CO2 emissions, climate change, and a fuel crisis. Therefore, several renewable and sustainable energy alternatives should be investigated to replace non-renewable fuels in the future. Algae present a versatile feedstock for the production of a variety of fuels (biodiesel, bioethanol, bio-hydrogen, etc.) and high-value compounds for food, fodder, cosmetics, and pharmaceuticals. Microalgae are simple microorganisms that require water, light, CO2, and nutrients for growth by the process of photosynthesis; they can grow in extreme environments and utilize waste gas (flue gas) and waste waters. Mixing, however, is a crucial parameter within the culture system for the uniform distribution of light, nutrients, and gaseous exchange, in addition to preventing settling/sedimentation and the creation of dark zones. The overarching goal of the present study is to improve photobioreactor (PBR) design for enhancing the dissolution of CO2 from ambient air (0.039%, v/v), pure CO2, and coal-fired flue gas (10 ± 2%) into microalgal PBRs. Computational fluid dynamics (CFD), a state-of-the-art technique, has been used to solve the partial differential equations with turbulence closure that represent the dynamics of the fluid in a photobioreactor. In this paper, the hydrodynamic performance of the PBR has been characterized and compared with that of a conventional bubble column PBR using CFD. Parameters such as flow rate (Q), mean velocity (u), and mean turbulent kinetic energy (TKE) were characterized for each experiment across different aeration schemes. The results showed that the modified PBR design had superior liquid circulation properties and gas-liquid transfer, which resulted in a more uniform environment inside the PBR compared to the conventional bubble column PBR. The CFD technique has proven promising for design and paves the path for future research to develop PBRs that can be made commercially available for scaled-up microalgal production.
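
As an illustrative post-processing step for the TKE comparison described above (the velocity samples below are synthetic placeholders, not the simulation's exported data), a minimal Python sketch of the standard definition k = 0.5(u'² + v'² + w'²) might look like this:

```python
# Post-processing sketch: mean turbulent kinetic energy (TKE) from velocity samples.
# Synthetic data stand in for CFD-exported velocity fields at monitoring points [m/s].
import numpy as np

rng = np.random.default_rng(0)
u = 0.15 + 0.02 * rng.standard_normal(10_000)   # hypothetical axial component
v = 0.00 + 0.03 * rng.standard_normal(10_000)   # hypothetical radial component
w = 0.05 + 0.01 * rng.standard_normal(10_000)   # hypothetical tangential component

def mean_tke(u, v, w):
    """Reynolds-decompose each component and average the fluctuation energy."""
    up, vp, wp = u - u.mean(), v - v.mean(), w - w.mean()
    return 0.5 * np.mean(up**2 + vp**2 + wp**2)   # [m^2/s^2]

print(f"mean velocity = {np.mean(np.sqrt(u**2 + v**2 + w**2)):.3f} m/s")
print(f"mean TKE      = {mean_tke(u, v, w):.5f} m^2/s^2")
```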

Keywords: computational fluid dynamics, microalgae, bubble column photobioreactor, flue gas, simulation

Procedia PDF Downloads 222
1175 Study of Climate Change Process on Hyrcanian Forests Using Dendroclimatology Indicators (Case Study of Guilan Province)

Authors: Farzad Shirzad, Bohlol Alijani, Mehry Akbary, Mohammad Saligheh

Abstract:

Climate change and global warming are very important issues today. The process of climate change, especially changes in temperature and precipitation, is among the most important issues in the environmental sciences; climate change means a change in long-term averages. Iran is located in arid and semi-arid regions due to its proximity to the equator and its location in the subtropical high-pressure zone. In this respect, the Hyrcanian forest is a green necklace between the Caspian Sea and the Alborz mountain range to the south; at the forty-third session of UNESCO's World Heritage Committee, it was registered as the second natural heritage site of Iran. Beech is one of the most important tree species and the most industrially significant species of the Hyrcanian forests. In this research, dendroclimatology was applied using tree-ring widths together with temperature and precipitation data from the Shanderman meteorological station located in the study area. The non-parametric Mann-Kendall test was used to investigate the trend of climate change over a 202-year time series of growth rings, and the Pearson correlation method was used to relate the growth-ring widths of beech trees to climatic variables in the region. The results obtained from the time series of beech growth rings showed that ring widths had a downward, negative trend that was significant at the 5% level, indicating that climate change has occurred. The average minimum, mean, and maximum temperatures and evaporation in the growing season had an increasing trend, while annual precipitation had a decreasing trend. Pearson correlations between growth-ring width and temperature were negative for the mean temperatures of July, August, and September, whereas the correlation with the mean maximum temperature in February was positive and significant at the 95% level; the correlation with June precipitation was also positive and significant at the 95% level.
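
As a self-contained illustration of the two statistical steps named above (the ring-width and climate series below are synthetic stand-ins for the 202-year beech chronology and the Shanderman records), a minimal Python sketch might look like this:

```python
# Illustrative trend and correlation analysis on synthetic dendroclimatological data.
import numpy as np
from scipy import stats

def mann_kendall(x):
    """Classic Mann-Kendall trend test (no tie correction); returns S, Z, two-sided p."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    s = sum(np.sign(x[j] - x[i]) for i in range(n - 1) for j in range(i + 1, n))
    var_s = n * (n - 1) * (2 * n + 5) / 18.0
    z = (s - np.sign(s)) / np.sqrt(var_s) if s != 0 else 0.0
    p = 2 * (1 - stats.norm.cdf(abs(z)))
    return s, z, p

rng = np.random.default_rng(1)
years = np.arange(1820, 2022)                                  # 202 years
ring_width = 1.2 - 0.002 * (years - years[0]) + 0.1 * rng.standard_normal(len(years))
summer_temp = 24 + 0.01 * (years - years[0]) + 0.5 * rng.standard_normal(len(years))

s, z, p = mann_kendall(ring_width)
r, p_r = stats.pearsonr(ring_width, summer_temp)
print(f"Mann-Kendall: Z = {z:.2f}, p = {p:.4f} (downward trend if Z < 0)")
print(f"Pearson ring width vs. summer temperature: r = {r:.2f}, p = {p_r:.4f}")
```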

Keywords: climate change, dendroclimatology, hyrcanian forest, beech

Procedia PDF Downloads 88
1174 Hansen Solubility Parameters, Quality by Design Tool for Developing Green Nanoemulsion to Eliminate Sulfamethoxazole from Contaminated Water

Authors: Afzal Hussain, Mohammad A. Altamimi, Syed Sarim Imam, Mudassar Shahid, Osamah Abdulrahman Alnemer

Abstract:

The extensive use of sulfamethoxazole (SUX) has become a global threat to human health due to water contamination from diverse sources. This study addressed the combined application of Hansen solubility parameters (HSPiP software) and the Quality by Design tool for developing various green nanoemulsions. The HSPiP program assisted in screening suitable excipients based on Hansen solubility parameters and experimental solubility data. Various green nanoemulsions were prepared and characterized for globular size, size distribution, zeta potential, and removal efficiency. Design Expert (DoE) software further helped to identify the critical factors with a direct impact on percent removal efficiency, size, and viscosity. Morphology was visualized under transmission electron microscopy (TEM). Finally, the treated water was studied to confirm the absence of the tested drug, employing ICP-OES (inductively coupled plasma optical emission spectroscopy) and HPLC (high-performance liquid chromatography). Results showed that HSPiP predicted a biocompatible lipid, a safe surfactant (lecithin), and propylene glycol (PG) as suitable excipients. The experimental solubility of the drug in the predicted excipients was quite convincing and vindicated the prediction. Various green nanoemulsions were fabricated and evaluated for in vitro performance. Globular size (100-300 nm), PDI (0.1-0.5), zeta potential (~25 mV), and removal efficiency (%RE = 70-98%) were found to be in acceptable ranges for deciding the input factors and their levels in the DoE. The experimental design tool assisted in identifying the most critical variables controlling %RE and the optimized nanoemulsion composition under the set constraints. Dispersion time was varied from 5 to 30 min. Finally, the ICP-OES and HPLC techniques corroborated the absence of SUX in the treated water. Thus, the strategy is simple, economical, selective, and efficient.
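
As a rough illustration of the Hansen-distance screening underlying excipient selection (the HSP triplets below are placeholders, not the HSPiP values used in the study), a minimal Python sketch of Ra² = 4(δD₁−δD₂)² + (δP₁−δP₂)² + (δH₁−δH₂)² might look like this:

```python
# Hansen-distance screening sketch; HSP values below are illustrative placeholders.
from math import sqrt

def hansen_distance(hsp1, hsp2):
    """Ra between two (dD, dP, dH) triplets, in MPa^0.5."""
    dD1, dP1, dH1 = hsp1
    dD2, dP2, dH2 = hsp2
    return sqrt(4 * (dD1 - dD2) ** 2 + (dP1 - dP2) ** 2 + (dH1 - dH2) ** 2)

drug = (19.0, 9.0, 10.0)                      # hypothetical HSP of sulfamethoxazole
excipients = {
    "lecithin":         (16.5, 8.0, 8.5),     # placeholder values
    "propylene glycol": (16.8, 10.4, 21.3),
    "oleic acid":       (16.0, 3.1, 5.5),
}

# Rank candidate excipients by proximity to the drug in Hansen space (smaller Ra = closer).
for name, hsp in sorted(excipients.items(), key=lambda kv: hansen_distance(drug, kv[1])):
    print(f"{name:18s} Ra = {hansen_distance(drug, hsp):5.2f} MPa^0.5")
```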

Keywords: quality by design, sulfamethoxazole, green nanoemulsion, water treatment, icp-oes, hansen program (hspip software)

Procedia PDF Downloads 63
1173 Fabrication of Aluminum Nitride Thick Layers by Modified Reactive Plasma Spraying

Authors: Cécile Dufloux, Klaus Böttcher, Heike Oppermann, Jürgen Wollweber

Abstract:

Hexagonal aluminum nitride (AlN) is a promising candidate for several wide-band-gap semiconductor applications such as deep-UV light-emitting diodes (UVC LEDs) and fast power transistors (HEMTs). To date, bulk AlN single crystals are still commonly grown by physical vapor transport (PVT). Single-crystalline AlN wafers obtained from this process could offer suitable substrates for the defect-free growth of ultimately active AlGaN layers; however, these wafers still suffer from small sizes, limited delivery quantities, and high prices. Although there is already increasing interest in the commercial availability of AlN wafers, comparatively cheap Si, SiC, or sapphire is still predominantly used as the substrate material for the deposition of active AlGaN layers. Nevertheless, due to a lattice mismatch of up to 20%, the obtained material shows high defect densities and is, therefore, less suitable for the high-power devices described above. Therefore, the use of AlN with specially adapted properties for optical and sensor applications could be promising for mass-market products, which seem to impose fewer requirements. To respond to the demand for suitable AlN target material for the growth of AlGaN layers, we have designed an innovative technology based on reactive plasma spraying. The goal is to produce coarse-grained AlN boules with an N-terminated columnar structure and high purity. In this process, aluminum is injected into a microwave-stimulated nitrogen plasma; AlN, the product of the reaction between the aluminum and the plasma-activated N2, is deposited onto the target. We used an aluminum filament as the starting material to minimize oxygen contamination during the process. The material was guided through the nitrogen plasma at a mass turnover of 10 g/h. To avoid any impurity contamination from erosion of the electrodes, an electrode-less discharge was used for plasma ignition. The pressure was maintained at 600-700 mbar, so the plasma reached a temperature high enough to vaporize the aluminum, which subsequently reacted with the surrounding plasma. The obtained products consist of thick polycrystalline AlN layers with a diameter of 2-3 cm. The crystallinity was determined by X-ray crystallography, and the grain structure was systematically investigated by optical and scanning electron microscopy. Furthermore, we performed Raman spectroscopy to provide evidence of stress in the layers. This paper will discuss the effects of process parameters such as microwave power and deposition geometry (specimen holder, radiation shields, ...) on the topography, crystallinity, and stress distribution of AlN.

Keywords: aluminum nitride, polycrystal, reactive plasma spraying, semiconductor

Procedia PDF Downloads 271
1172 Bundling of Transport Flows: Adoption Barriers and Opportunities

Authors: Vandenbroucke Karel, Georges Annabel, Schuurman Dimitri

Abstract:

In recent years, bundling of transport flows, whether or not implemented in an intermodal process, has emerged as a promising concept in the logistics sector. Bundling of transport flows is a process in which two or more shippers decide to synergize their shipped goods over a common transport lane. Promoted by the European Commission, several programs have been set up and have shown their benefits. Bundling promises both shippers and logistics service providers economic, societal, and ecological benefits. By bundling transport flows and thus reducing the required truck (or other carrier) capacity, the problems of driver shortage, increased fuel prices, mileage charges, and restricted hours of service on the road are mitigated. In theory, the advantages of bundled transport exceed the drawbacks; in practice, however, adoption among shippers remains low. In fact, bundling is seen as a disruptive process in the rather traditional logistics sector. In this context, a Belgian company asked iMinds Living Labs to set up a Living Lab research project with the goal of investigating how the uptake of bundling of transport flows can be accelerated and of checking whether an online data-sharing platform can overcome the adoption barriers. The Living Lab research was conducted in 2016 and combined quantitative and qualitative end-user and market research. Concretely, extensive desk research was combined with insights from expert interviews with four consultants active in the Belgian logistics sector and in-depth interviews with logistics professionals working for shippers (N=10) and LSPs (N=3). In this article, we present findings which show that several factors slow down the uptake of bundling of transport flows. Shippers are hesitant to change how they currently work and to work together with other shippers, and several practical challenges impede shippers from working together. We also identify further reasons why adoption has not yet accelerated. First, there is not enough support from governmental and commercial organizations. Secondly, there is a chicken-and-egg problem: too few interested parties lead to no or very few matching lanes, and shippers are therefore reluctant to partake in these projects because the benefits have not yet been proven. Thirdly, the incentive is not big enough for shippers: road transport organized by the shipper individually is still seen as the easiest and cheapest solution. A solution to the abovementioned challenges might be found in the online data-sharing platform of the Belgian company. The added value of this platform is that it shows shippers possible matching lanes, without the shippers having to invest time in negotiating and networking with other shippers and running the risk of not finding a match. The interviewed shippers and experts indicated that the online data-sharing platform is a very promising concept which could accelerate the uptake of bundling of transport flows.

Keywords: adoption barriers, bundling of transport, shippers, transport optimization

Procedia PDF Downloads 190
1171 Optimization of Bills Assignment to Different Skill-Levels of Data Entry Operators in a Business Process Outsourcing Industry

Authors: M. S. Maglasang, S. O. Palacio, L. P. Ogdoc

Abstract:

Business Process Outsourcing has been one of the fastest-growing and emerging industries in the Philippines today. Unlike most contact service centers, more popularly known as 'call centers', the BPO company's primary outsourced service is performing audits of its global clients' logistics. In a service industry, manpower is considered the most important yet most expensive resource in the company. Because of this, there is a need to maximize human resources so that people are effectively and efficiently utilized. The main purpose of the study is to optimize the current manpower resources through effective distribution and assignment of different types of bills to the different skill levels of data entry operators. The assignment model parameters include the average observed time matrix gathered through a time study, which incorporates the learning curve concept. Subsequently, a simulation model was built to replicate the arrival rate of demand, which includes the different batches and types of bills per day. Next, a mathematical linear programming model was formulated; its objective is to minimize the direct labor cost per bill by allocating the different types of bills to the different skill levels of operators. Finally, a hypothesis test was done to validate the model, comparing the actual and simulated results. The analysis of results revealed low utilization of effective capacity because the current assignment fails to account for the product mix, skill mix, and simulated demand as model parameters. Moreover, failure to consider the effects of the learning curve leads to overestimation of labor needs. From the current 107 operators, the proposed model gives a result of 79 operators, increasing the utilization of effective capacity by 14.94%. It is recommended that the excess 28 operators be reallocated to other areas of the department. Finally, a manpower capacity planning model is also recommended to support management's decisions on what to do when current capacity reaches its limit under the expected increase in demand.
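
As a toy version of the bill-assignment linear program described above (the wage rates, processing times, demand, and capacities below are hypothetical; the study's actual parameters came from the time study and simulation), a minimal SciPy sketch might look like this:

```python
# Toy bill-assignment LP: minimize direct labor cost subject to demand and capacity.
import numpy as np
from scipy.optimize import linprog

wage = np.array([1.5, 2.0, 2.6])                # cost per minute by skill level (low -> high)
time = np.array([[6.0, 9.0, 12.0],              # minutes per bill: rows = skill level,
                 [5.0, 7.0,  9.5],              # columns = bill type
                 [4.0, 5.5,  7.0]])
demand = np.array([400, 250, 150])              # bills of each type per day
capacity = np.array([480 * 8, 480 * 6, 480 * 4])  # minutes per skill level (8, 6, 4 operators)

n_skill, n_type = time.shape
c = (wage[:, None] * time).ravel()              # direct labor cost per bill, flattened x[i, j]

# Demand constraints: sum over skill levels of x[i, j] == demand[j]
A_eq = np.zeros((n_type, n_skill * n_type))
for j in range(n_type):
    A_eq[j, j::n_type] = 1.0

# Capacity constraints: sum over bill types of time[i, j] * x[i, j] <= capacity[i]
A_ub = np.zeros((n_skill, n_skill * n_type))
for i in range(n_skill):
    A_ub[i, i * n_type:(i + 1) * n_type] = time[i]

res = linprog(c, A_ub=A_ub, b_ub=capacity, A_eq=A_eq, b_eq=demand, bounds=(0, None))
print("minimum daily labor cost:", round(res.fun, 2))
print("bills per (skill level, bill type):\n", res.x.reshape(n_skill, n_type).round(1))
```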

Keywords: optimization modelling, linear programming, simulation, time and motion study, capacity planning

Procedia PDF Downloads 499