Search results for: semantic action representation
362 Biostimulant Activity of Chitooligomers: Effect of Different Degrees of Acetylation and Polymerization on Wheat Seedlings under Salt Stress
Authors: Xiaoqian Zhang, Ping Zou, Pengcheng Li
Abstract:
Salt stress is one of the most serious abiotic stresses, and it can lead to a reduction in agricultural productivity. High salt concentration makes it more difficult for roots to absorb water and disturbs the homeostasis of cellular ions, resulting in osmotic stress, ion toxicity and the generation of reactive oxygen species (ROS). Compared with normal physiological conditions, salt stress inhibits photosynthesis, breaks the metabolic balance and damages cellular structures, ultimately reducing crop yield. It is therefore vital to develop practical methods for improving the salt tolerance of plants. Chitooligomers (COS) are partially depolymerized products of chitosan, consisting of D-glucosamine and N-acetyl-D-glucosamine. In agriculture, COS can promote plant growth and induce plant innate immunity. The bioactivity of COS is closely related to its degree of polymerization (DP) and degree of acetylation (DA). However, most previous reports fail to address how COS with different DPs and DAs improve the capacity of plants to withstand salt stress. Accordingly, in this study, COS with different DAs were used to test the response of wheat seedlings to salt stress. In addition, COS with defined DPs (DP 4-12) and a heterogeneous COS mixture were applied to explore the relationship between the DP of COS and its effect on the growth of wheat seedlings under salt stress. The results showed that COS, as an exogenous elicitor, could promote the growth of wheat seedlings, reduce the malondialdehyde (MDA) concentration, and increase the activities of antioxidant enzymes. The mRNA expression levels of salt stress-responsive genes indicated that COS protects plants from salt stress through the regulation of MDA concentration and the increased antioxidant enzyme activities. Moreover, the activity of COS was closely related to its DA, and COS (DA: 50%) displayed the best salt resistance activity in wheat seedlings. The results also showed that COS with different DPs could promote the growth of wheat seedlings under salt stress. COS with a DP of 6-8 showed better activity than the other tested samples, implying that its activity is closely related to its DP. After treatment with chitohexaose, chitoheptaose, and chitooctaose, the photosynthetic parameters were markedly improved. The soluble sugar and proline contents were increased by 26.7%-53.3% and 43.6%-70.2%, respectively, while the concentration of MDA was reduced by 36.8%-49.6%. In addition, the antioxidant enzyme activities were clearly activated. At the molecular level, the results revealed that these COS could markedly induce the expression of Na+/H+ antiporter genes. In general, these results provide a foundation for studying the mechanism of action of COS in promoting plant growth under salt stress and for the preparation of plant growth regulators.
Keywords: chitooligomers (COS), degree of polymerization (DP), degree of acetylation (DA), salt stress
361 A Paradigm Shift into the Primary Teacher Education Program in Bangladesh
Authors: Happy Kumar Das, Md. Shahriar Shafiq
Abstract:
This paper portrays an envisaged change in the primary teacher education program in Bangladesh. An initiative has been taken with a vision to ensure an integrated approach to developing trainee teachers' knowledge and understanding about learning at a deeper level, and with that aim, the Diploma in Primary Education (DPEd) program replaces the Certificate-in-Education (C-in-Ed) program for primary teachers in the Bangladeshi context. The stated professional values of the existing program, such as a 'learner-centered' and 'reflective' approach to pedagogy, tend to contradict the practice exemplified through the delivery mechanism. To address these challenges, through its two main components, (i) Training Institute-based learning and (ii) School-based learning, the new program aims to cover the knowledge and values that underpin the actual practice of teaching. These two components are given approximately equal weighting within the program in terms of time, content and assessment, as the integration seeks to combine theoretical knowledge with practical knowledge and vice versa. The curriculum emphasizes a balance between the taught modules and the components of the practicum. For example, the theories of formative and summative assessment techniques are elaborated through focused reflection on case studies as well as observation and teaching practice in the classroom. The key ideology reflected in this newly developed program is the teacher's belief in 'holistic education', which can create opportunities for skills development in all three domains (cognitive, social and affective) simultaneously. The proposed teacher education program aims to address these areas of generic skill development alongside subject-specific learning outcomes. An exploratory study was designed in this regard, in which 7 Primary Teachers' Training Institutes (PTIs) in 7 divisions of Bangladesh were used to pilot the DPEd program. The analysis was based on document analysis, periodical monitoring reports and empirical data gathered from the experimental PTIs. The findings of the study revealed that the intervention brought positive change in teachers' professional beliefs, attitudes and skills, along with improvement of the school environment. Teachers in training schools work together for collective professional development, supporting each other through lesson study, action research, reflective journals, group sharing and so on. Although the DPEd program addresses the above-mentioned factors, one of the challenges of the proposed program is the existing capacity and capability of the PTIs for its effective implementation.
Keywords: Bangladesh, effective implementation, primary teacher education, reflective approach
360 Amniotic Fluid Stem Cells Ameliorate Cisplatin-Induced Acute Renal Failure through Autophagy Induction and Inhibition of Apoptosis
Authors: Soniya Nityanand, Ekta Minocha, Manali Jain, Rohit Anthony Sinha, Chandra Prakash Chaturvedi
Abstract:
Amniotic fluid stem cells (AFSC) have been shown to contribute towards the amelioration of acute renal failure (ARF), but the mechanisms underlying the renoprotective effect are largely unknown. Therefore, the main goal of the current study was to evaluate the therapeutic efficacy of AFSC in a cisplatin-induced rat model of ARF and to investigate the underlying mechanisms responsible for its renoprotective effect. To study the therapeutic efficacy of AFSC, ARF was induced in Wistar rats by an intra-peritoneal injection of cisplatin, and five days after administration, the rats were randomized into two groups and injected with either AFSC or normal saline intravenously. On days 8 and 12 after cisplatin injection, i.e., days 3 and 7 post-therapy respectively, the blood biochemical parameters, histopathological changes, apoptosis and expression of pro-apoptotic, anti-apoptotic and autophagy-related proteins in renal tissues were studied in both groups of rats. Administration of AFSC in ARF rats resulted in improvement of renal function and attenuation of renal damage, as reflected by a significant decrease in blood urea nitrogen, serum creatinine levels, tubular cell apoptosis as assessed by the Bax/Bcl2 ratio, and expression of the pro-apoptotic proteins PUMA, Bax, cleaved caspase-3 and cleaved caspase-9 as compared to the saline-treated group. Furthermore, in the AFSC-treated group as compared to the saline-treated group, there was a significant increase in the activation of autophagy, as evident from increased expression of LC3-II, ATG5, ATG7, Beclin1 and phospho-AMPK levels, with a concomitant decrease in phospho-p70S6K and p62 expression levels. To further confirm whether the protective effects of AFSC on cisplatin-induced apoptosis were dependent on autophagy, chloroquine, an autophagy inhibitor, was administered by the intra-peritoneal route. Chloroquine administration led to a significant reduction in the anti-apoptotic effects of the AFSC therapy and further deterioration in the renal structure and function caused by cisplatin. Collectively, our results indicate that AFSC ameliorate cisplatin-induced ARF through induction of autophagy and inhibition of apoptosis. Furthermore, the protective effects of AFSC were blunted by chloroquine, highlighting that activation of autophagy is an important mechanism of action for the protective role of AFSC in cisplatin-induced renal injury.
Keywords: amniotic fluid stem cells, acute renal failure, autophagy, cisplatin
359 Comparison of Verb Complementation Patterns in Selected Pakistani and British English Newspaper Social Columns: A Corpus-Based Study
Authors: Zafar Iqbal Bhatti
Abstract:
The present research aims to examine and evaluate the frequencies and practices of verb complementation patterns in English newspaper social columns published in Pakistan and Britain. The research will demonstrate that Pakistani English is a non-native variety of English with its own regular and systematic characteristics at the syntactic level, shaped by the native languages and culture of its users; any differences from British or American English that are systematic and regular are therefore to be treated as distinctive features of the variety rather than as erroneous forms. The objectives are, first, to examine the verb complementation patterns that British and Pakistani social columnists use in relation to their syntactic categories and, second, to compare the verb complementation patterns used in Pakistani and British English newspaper social columns. This study will identify various verb complementation patterns in Pakistani and British English newspaper social columns and their occurrence and distribution. Word classes express different functions of words, such as action, event, or state of being. This research aims to evaluate whether there are any appreciable differences in the verb complementation patterns used in Pakistani and British English newspaper social columns. The results will show the range of verb complementation patterns in the selected English newspaper social columns. This study will fill a gap left by previous studies in this field, which have explored little of the differences between Pakistani and British English newspapers, and it will also examine the variety of language used in Pakistani and British English journals, as well as regional and cultural values and variations. The researcher will use AntConc software to extract the data for analysis and a concordance tool to identify verb complementation patterns in the selected data; these will then be categorized manually, because the same form can sometimes be used for various purposes. A four-month written corpus of the social columns of Pakistani English (PE) and British English (BE) newspapers, from 1st June 2022 to 30th September 2022, will be collected and analyzed. For the analysis of the research questions, 50 social columns will be selected from Pakistani newspapers and 50 from British newspapers, giving a representative sample of data from both. The researcher will manually analyze the complementation pattern of each verb in each sentence and determine how frequently each pattern occurs, using the syntactic characteristics of verb complementation elements as described by Downing and Locke (2006). All verb complementation patterns in the data will be examined, and the frequency and distribution of each pattern will be evaluated using the software.
Keywords: verb complementation, syntactic categories, newspaper social columns, corpus
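To make the frequency step concrete, the following is a minimal sketch (not part of the study) of how manually coded concordance hits exported from AntConc could be tallied into raw and normalised pattern frequencies for the two sub-corpora; the file name, column names and corpus sizes are hypothetical assumptions.

```python
from collections import Counter
import csv

# Hypothetical input: one row per concordance hit exported from AntConc and
# manually coded with the verb, a pattern label (e.g. "V + NP", "V + that-clause"),
# and the sub-corpus it came from ("PE" or "BE").
def load_codings(path):
    with open(path, newline="", encoding="utf-8") as f:
        return list(csv.DictReader(f))  # expects columns: verb, pattern, corpus

def pattern_frequencies(rows, corpus_words):
    """Raw and per-10,000-word frequencies of each complementation pattern per corpus."""
    counts = {"PE": Counter(), "BE": Counter()}
    for row in rows:
        counts[row["corpus"]][row["pattern"]] += 1
    normalised = {
        corpus: {p: n * 10_000 / corpus_words[corpus] for p, n in c.items()}
        for corpus, c in counts.items()
    }
    return counts, normalised

if __name__ == "__main__":
    rows = load_codings("coded_concordance_hits.csv")   # hypothetical file name
    corpus_words = {"PE": 52_000, "BE": 48_000}          # hypothetical corpus sizes
    raw, per_10k = pattern_frequencies(rows, corpus_words)
    for corpus in ("PE", "BE"):
        for pattern, freq in sorted(per_10k[corpus].items(), key=lambda x: -x[1]):
            print(f"{corpus}\t{pattern}\t{raw[corpus][pattern]}\t{freq:.1f}")
```

Normalising per 10,000 words is one common way to make the two sub-corpora comparable when their total sizes differ.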
358 A Digital Environment for Developing Mathematical Abilities in Children with Autism Spectrum Disorder
Authors: M. Isabel Santos, Ana Breda, Ana Margarida Almeida
Abstract:
Research on the academic abilities of individuals with autism spectrum disorder (ASD) underlines the importance of mathematics interventions. Yet the development of digital applications for children and youth with ASD continues to attract little attention, namely regarding the development of mathematical reasoning, even though digital technologies are an area of great interest for individuals with this disorder and their use is certainly a facilitative strategy in the development of mathematical abilities. The use of digital technologies can be an effective way to create innovative learning opportunities for these students and to develop creative, personalized and constructive environments where they can develop differentiated abilities. Children with ASD often respond well to learning activities involving information presented visually. In this context, we present the digital Learning Environment on Mathematics for Autistic children (LEMA), a research project carried out within a PhD in Multimedia in Education and developed by the Thematic Line Geometrix, located in the Department of Mathematics, in collaboration with the DigiMedia Research Center of the Department of Communication and Art (University of Aveiro, Portugal). LEMA is a digital mathematical learning environment whose activities are dynamically adapted to the user's profile, towards the development of the mathematical abilities of children aged 6-12 years diagnosed with ASD. LEMA has already been evaluated with end-users (both students and expert teachers), and based on the analysis of the collected data, readjustments were made, enabling the continuous improvement of the prototype, namely through the integration of universal design for learning (UDL) approaches, which are of great importance in ASD due to its heterogeneity. The learning strategies incorporated in LEMA are: (i) providing options for the custom choice of math activities, according to the user's profile; (ii) integrating simple interfaces with few elements, presenting only the features and content needed for the ongoing task; (iii) using simple visual and textual language; (iv) using different types of feedback (auditory, visual, positive/negative reinforcement, hints with helpful instructions including math concept definitions, solved math activities split into easier tasks and, finally, videos/animations that show a solution to the proposed activity); (v) providing information in multiple representations, such as text, video, audio and image, for better understanding of content and vocabulary, in order to stimulate, motivate and engage users in mathematical learning, also helping users to focus on content; (vi) avoiding elements that distract or interfere with focus and attention; (vii) providing clear instructions and orientation about tasks to ease the user's understanding of the content and its language; and (viii) using buttons, familiar icons and contrast between font and background. Since these children may have little sensory tolerance and impaired motor skills, besides interacting with LEMA through the mouse (point and click with a single button), the user can also interact with LEMA through a Kinect device (using simple gesture moves).
Keywords: autism spectrum disorder, digital technologies, inclusion, mathematical abilities, mathematical learning activities
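As an illustration of the kind of profile-based adaptation described above, the following is a minimal sketch under assumed data structures; the real LEMA profile model, activity catalogue and matching rules are not specified in the abstract, so every field and rule here is hypothetical.

```python
from dataclasses import dataclass, field

# Hypothetical data structures illustrating how activities might be matched to a
# user's profile; they do not reproduce LEMA's actual internal model.
@dataclass
class UserProfile:
    age: int
    mastered_topics: set = field(default_factory=set)
    preferred_media: str = "video"        # "text", "audio", "image" or "video"

@dataclass
class Activity:
    topic: str
    prerequisite: str | None
    media: set                            # representations available for the task
    difficulty: int                       # 1 (easiest) .. 3

def select_activities(profile: UserProfile, catalogue: list[Activity]) -> list[Activity]:
    """Return activities the child is ready for, easiest first, preferring the
    representation (video/audio/text/image) indicated in the profile."""
    ready = [a for a in catalogue
             if (a.prerequisite is None or a.prerequisite in profile.mastered_topics)
             and a.topic not in profile.mastered_topics]
    return sorted(ready, key=lambda a: (a.difficulty, profile.preferred_media not in a.media))

catalogue = [
    Activity("counting to 10", None, {"video", "image"}, 1),
    Activity("simple addition", "counting to 10", {"text", "video"}, 2),
    Activity("shapes", None, {"image"}, 1),
]
child = UserProfile(age=7, mastered_topics={"counting to 10"})
for activity in select_activities(child, catalogue):
    print(activity.topic, activity.difficulty)
```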
357 The Lopsided Burden of Non-Communicable Diseases in India: Evidences from the Decade 2004-2014
Authors: Kajori Banerjee, Laxmi Kant Dwivedi
Abstract:
India is part of the ongoing globalization, contemporary convergence, industrialization and technical advancement taking place worldwide. Some of the manifestations of this evolution are rapid demographic, socio-economic, epidemiological and health transitions. There has been a considerable increase in non-communicable diseases due to changes in lifestyle. This study aims to assess the direction of the burden of disease and compare the pressure of infectious diseases against cardio-vascular, endocrine, metabolic and nutritional diseases. The change in prevalence over a ten-year period (2004-2014) is further decomposed to determine the net contribution of various socio-economic and demographic covariates. The present study uses the recent 71st (2014) and 60th (2004) rounds of the National Sample Survey. The pressure of infectious diseases against cardio-vascular (CVD) and endocrine, metabolic and nutritional (EMN) diseases during 2004-2014 is assessed through Prevalence Rates (PR), Hospitalization Rates (HR) and Case Fatality Rates (CFR). The prevalence of non-communicable diseases is further used as a dependent variable in a logit regression to find the effect of various social, economic and demographic factors on the chances of suffering from a particular disease. A multivariate decomposition technique further assists in determining the net contribution of socio-economic and demographic covariates. This paper provides evidence of stagnation in the burden of communicable diseases (CD) and a rapid increase in the burden of non-communicable diseases (NCD) uniformly across all population sub-groups in India. The CFR for CVD increased drastically during 2004-2014. The logit regression indicates that the chances of suffering from CVD and EMN diseases are significantly higher among urban residents, older ages, females, and widowed, divorced and separated individuals. The decomposition provides ample evidence that improvements in quality-of-life markers such as education, urbanization and longevity have positively contributed to the increase in NCD prevalence rates. In India's current epidemiological phase, the compression theory of morbidity is in action, as a significant rise in the probability of contracting NCDs over the period is observed among older ages. Age is found to be a vital contributor to the increase in the probability of having CVD and EMN diseases over the study decade 2004-2014 in the nationally representative sample of the National Sample Survey.
Keywords: cardio-vascular disease, case-fatality rate, communicable diseases, hospitalization rate, multivariate decomposition, non-communicable diseases, prevalence rate
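The logit step described above can be sketched as follows on purely synthetic data; the variables, coefficients and data-generating process are illustrative assumptions and do not reproduce the National Sample Survey micro-data or the study's estimates.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Synthetic illustration only: the NSS micro-data are not reproduced here.
rng = np.random.default_rng(0)
n = 5_000
df = pd.DataFrame({
    "age": rng.integers(15, 85, n),
    "urban": rng.integers(0, 2, n),
    "female": rng.integers(0, 2, n),
    "round_2014": rng.integers(0, 2, n),   # 0 = 2004 round, 1 = 2014 round
})
# Assumed data-generating process: odds of reporting a cardio-vascular disease
# rise with age, urban residence and the later survey round.
logit_p = -6 + 0.05 * df["age"] + 0.4 * df["urban"] + 0.6 * df["round_2014"]
df["cvd"] = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))

# Logit regression of disease status on socio-economic and demographic covariates.
model = smf.logit("cvd ~ age + urban + female + round_2014", data=df).fit(disp=False)
print(model.summary())
print("Odds ratios:\n", np.exp(model.params))
```

A multivariate decomposition would then split the 2004-2014 change in predicted prevalence into the part due to changing covariate levels and the part due to changing coefficients; that step is not shown here.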
356 Stochastic Pi Calculus in Financial Markets: An Alternate Approach to High Frequency Trading
Authors: Jerome Joshi
Abstract:
The paper presents the modelling of financial markets using the Stochastic Pi Calculus model. The Stochastic Pi Calculus model is mainly used for biological applications; however, its features promote its use in financial markets, most prominently in high frequency trading. The trading system can be broadly classified into the exchange, market makers or intermediary traders, and fundamental traders. The exchange is where the action of the trade is executed, and the two types of traders act as market participants in the exchange. High frequency trading, with its complex networks and numerous market participants (intermediary and fundamental traders), poses difficulties for modelling. It involves participants seeking the advantage of complex trading algorithms and high execution speeds to carry out large volumes of trades. To earn profits from each trade, the trader must be at the top of the order book quite frequently, executing or processing multiple trades simultaneously. This requires highly automated systems as well as the right sentiment to outperform other traders. However, always being at the top of the book is not best for the trader either, since it was the reason for the outbreak of the 'Hot-Potato Effect', which in turn demands a better and more efficient model. The characteristics of the model should be such that it is flexible and has diverse applications. Therefore, a model should be chosen that is already applied in a similar field characterized by such difficulty. It should also be flexible in its simulation so that it can be further extended and adapted for future research, and it should be equipped with tools that allow it to be used effectively in the field of finance. In this case, the Stochastic Pi Calculus model seems an ideal fit for financial applications, owing to its established use in biology. It is an extension of the original Pi Calculus model and acts as a solution and an alternative to the previously flawed algorithm, provided its application is further extended. This model focuses on solving the problem which led to the 'Flash Crash', namely the 'Hot-Potato Effect'. The model consists of small sub-systems which can be integrated to form a large system. It is designed in such a way that the behavior of 'noise traders' is treated as a random process, or noise, in the system. While modelling, to get a better understanding of the problem, a broader picture is taken into consideration, including the trader, the system, and the market participants. The paper goes on to explain trading in exchanges, types of traders, high frequency trading, the 'Flash Crash', the 'Hot-Potato Effect', evaluation of orders and time delay in further detail. In the future, there is a need to focus on the calibration of the modules so that they interact properly with one another. This model, with its application extended, would provide a basis for further research in the fields of finance and computing.
Keywords: concurrent computing, high frequency trading, financial markets, stochastic pi calculus
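The stochastic race between communication channels that underlies the stochastic pi calculus can be illustrated with a minimal Gillespie-style simulation; the channels, rates and order-book bookkeeping below are toy assumptions, not the calibrated model proposed in the paper.

```python
import random

# A minimal Gillespie-style sketch of the stochastic race between communication
# channels, which is the execution model underlying the stochastic pi calculus.
# The channel names and rates are illustrative assumptions only.
def simulate(t_end=1.0):
    t, book_depth = 0.0, 0
    channels = {
        "market_maker_quote": 500.0,   # rate (events per unit time): adds liquidity
        "fundamental_order": 50.0,     # consumes liquidity
        "noise_trade": 200.0,          # random "noise trader" activity
    }
    while t < t_end:
        total_rate = sum(channels.values())
        t += random.expovariate(total_rate)          # time to the next communication
        r, cum = random.uniform(0, total_rate), 0.0
        for name, rate in channels.items():          # pick the winning channel
            cum += rate
            if r <= cum:
                event = name
                break
        else:
            event = name                             # guard against rounding edge cases
        if event == "market_maker_quote":
            book_depth += 1
        elif book_depth > 0:                         # orders only fill if depth exists
            book_depth -= 1
    return book_depth

random.seed(1)
print("order-book depth after 1 time unit:", simulate())
```

The point of the sketch is only the execution model: concurrent sub-systems communicate over channels, and the next event is drawn from an exponential race weighted by channel rates, which is how noise-trader activity can be treated as a random process alongside the other participants.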
355 Technology, Ethics and Experience: Understanding Interactions as Ethical Practice
Authors: Joan Casas-Roma
Abstract:
Technology has become one of the main channels through which people engage in most of their everyday activities; from working to learning, or even when socializing, technology often acts as both an enabler and a mediator of such activities. Moreover, the affordances and interactions created by those technological tools determine the way in which users interact with one another, as well as how they relate to the relevant environment, thus favoring certain kinds of actions and behaviors while discouraging others. In this regard, virtue ethics theories place a strong focus on a person's daily practice (understood as their decisions, actions, and behaviors) as the means to develop and enhance their habits and ethical competences, such as their awareness and sensitivity towards certain ethically-desirable principles. Under this understanding of ethics, this set of technologically-enabled affordances and interactions can be seen as the possibility space where the daily practice of their users takes place in a wide plethora of contexts and situations. At this point, the following question arises: could these affordances and interactions be shaped in a way that would promote behaviors and habits based on ethically-desirable principles in their users? In the field of game design, the MDA framework (which stands for Mechanics, Dynamics, Aesthetics) explores how the interactions enabled within the possibility space of a game can lead to creating certain experiences and provoking specific reactions in the players. In this sense, these interactions can be shaped in ways that create experiences that raise the players' awareness and sensitivity towards certain topics or principles. This research brings together the notion of technological affordances, the notions of practice and practical wisdom from virtue ethics, and the MDA framework from game design in order to explore how the possibility space created by technological interactions can be shaped in ways that enable and promote actions and behaviors supporting certain ethically-desirable principles. When shaped accordingly, interactions supporting certain ethically-desirable principles could allow their users to carry out the kind of practice that, according to virtue ethics theories, provides the grounds to develop and enhance their awareness, sensitivity, and ethical reasoning capabilities. Moreover, because ethical practice can happen collaterally in almost every context, decision, and action, this additional layer could potentially be applied in a wide variety of technological tools, contexts, and functionalities. This work explores the theoretical background, as well as the initial considerations and steps that would be needed in order to harness the potential ethically-desirable benefits that technology can bring, once it is understood as the space where most of its users' daily practice takes place.
Keywords: ethics, design methodology, human-computer interaction, philosophy of technology
354 Developing Pedagogy for Argumentation and Teacher Agency: An Educational Design Study in the UK
Authors: Zeynep Guler
Abstract:
Argumentation and the production of scientific arguments are essential components for helping students become scientifically literate by engaging them in constructing and critiquing ideas. Incorporating argumentation into science classrooms is challenging and can be a long-term process for both students and teachers. Students have difficulty engaging in tasks that require them to craft arguments, evaluate them to seek weaknesses, and revise them. Teachers also struggle with facilitating argumentation when they have underdeveloped science practices, underdeveloped pedagogical knowledge for argumentation in science teaching, or underdeveloped teaching practice with argumentation (or a combination of all three). Thus, there is a need to support teachers in developing pedagogy for science teaching as argumentation, in planning and implementing teaching practice for facilitating argumentation, and also in becoming more agentic in this regard. Looking specifically at the experience of agency within education, it is arguable that agency is necessary for teachers' renegotiation of professional purposes and practices in the light of changing educational practices. This study investigated how science teachers develop pedagogy for argumentation, both individually and with their colleagues, and also how teachers become more agentic (or not) through active engagement with their contexts-for-action, referred to here as an ecological understanding of agency, in order to positively influence or change their practice and their students' engagement with argumentation over two academic years. Through an educational design study, this research was conducted with three secondary science teachers (key stage 3, year 7 students aged 11-12) in the UK to find out whether similar or different patterns of developing pedagogy for argumentation and of becoming more agentic emerge as teachers engage in planning and implementing a cycle of activities during the practice of teaching science with argumentation. Data from video and audio recordings of classroom practice and open-ended interviews with the science teachers were analysed using content analysis. The findings indicated that all the science teachers perceived strong agency in their opportunities to develop and apply pedagogical practices within the classroom. The teachers were pro-actively shaping their practices and classroom contexts in ways that went over and above the amendments to their pedagogy. They demonstrated some outcomes in developing pedagogy for argumentation and becoming more agentic in their teaching as a result of the collaboration with their colleagues and the researcher; some appeared more agentic than others. The role of collaboration between colleagues was seen as crucial for the teachers' practice in the schools: close collaboration and support from other teachers in planning and implementing new educational innovations were seen as crucial for the development of pedagogy and for becoming more agentic in practice. Teachers needed to understand the importance of scientific argumentation but also how it can be planned and integrated into classroom practice. They also perceived constraints arising from their lack of competence and knowledge in posing appropriate questions to help students engage in argumentation and in providing support for students' construction of oral and written arguments.
Keywords: argumentation, teacher professional development, teacher agency, students' construction of argument
353 Controlling the Release of Cyt C and L- Dopa from pNIPAM-AAc Nanogel Based Systems
Authors: Sulalit Bandyopadhyay, Muhammad Awais Ashfaq Alvi, Anuvansh Sharma, Wilhelm R. Glomm
Abstract:
Release of drugs from nanogels and nanogel-based systems can occur under the influence of external stimuli such as temperature, pH and magnetic fields. pNIPAm-AAc nanogels respond to the combined action of both temperature and pH, the former being mostly determined by hydrophilic-to-hydrophobic transitions above the volume phase transition temperature (VPTT), while the latter is controlled by the degree of protonation of the carboxylic acid groups. These nanogel-based systems are promising candidates in the field of drug delivery. Combining nanogels with magneto-plasmonic nanoparticles (NPs) introduces imaging and targeting modalities along with stimuli-response in one hybrid system, thereby incorporating multifunctionality. Fe@Au core-shell NPs possess an optical signature in the visible spectrum owing to the localized surface plasmon resonance (LSPR) of the Au shell, and superparamagnetic properties stemming from the Fe core. Although several synthesis methods exist to control the size and physico-chemical properties of pNIPAm-AAc nanogels, there is no comprehensive study that highlights the effect of incorporating one or more layers of NPs into these nanogels. In addition, effective determination of the VPTT of the nanogels is a challenge, which complicates their use in biological applications. Here, we have modified the swelling-collapse properties of pNIPAm-AAc nanogels by combining them with Fe@Au NPs using different solution-based methods. The hydrophilic-hydrophobic transition of the nanogels above the VPTT has been confirmed to be reversible. Further, an analytical method has been developed to deduce the average VPTT, which is found to be 37.3°C for the nanogels and 39.3°C for nanogel-coated Fe@Au NPs. An opposite swelling-collapse behaviour is observed for the latter, where the Fe@Au NPs act as bridge molecules pulling together the gelling units. Thereafter, Cyt C, a model protein drug, and L-Dopa, a drug used in the clinical treatment of Parkinson's disease, were loaded separately into the nanogels and nanogel-coated Fe@Au NPs using a modified breathing-in mechanism. This gave high loading and encapsulation efficiencies (L-Dopa: ~9% and 70 µg/mg of nanogels; Cyt C: ~30% and 10 µg/mg of nanogels, respectively). The release kinetics of L-Dopa, monitored using UV-vis spectrophotometry, was observed to be rather slow (over several hours), with the highest release occurring under a combination of high temperature (above the VPTT) and acidic conditions. However, the release of L-Dopa from nanogel-coated Fe@Au NPs was the fastest, accounting for the release of almost 87% of the initially loaded drug in ~30 hours. The chemical structure of the drug, the drug incorporation method, the location of the drug and the presence of Fe@Au NPs largely alter the drug release mechanism and kinetics of these nanogels and nanogel-coated Fe@Au NPs.
Keywords: controlled release, nanogels, volume phase transition temperature, l-dopa
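The loading and release figures quoted above follow from simple ratios; the sketch below shows the arithmetic with hypothetical masses chosen only to reproduce numbers of the same order as those reported, not the measured values of the study.

```python
def encapsulation_efficiency(drug_loaded_ug: float, drug_added_ug: float) -> float:
    """Percent of the initially added drug that ends up inside the nanogels."""
    return 100.0 * drug_loaded_ug / drug_added_ug

def loading_capacity(drug_loaded_ug: float, nanogel_mass_mg: float) -> float:
    """Micrograms of drug carried per milligram of nanogel."""
    return drug_loaded_ug / nanogel_mass_mg

def cumulative_release(released_ug: float, loaded_ug: float) -> float:
    """Percent of the loaded drug released at a given time point."""
    return 100.0 * released_ug / loaded_ug

# Hypothetical masses for illustration; they are not the measured values of the study.
print(encapsulation_efficiency(drug_loaded_ug=700, drug_added_ug=7800))  # ~9 %
print(loading_capacity(drug_loaded_ug=700, nanogel_mass_mg=10))          # 70 µg/mg
print(cumulative_release(released_ug=609, loaded_ug=700))                # ~87 %
```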
352 Cross-Validation of the Data Obtained for ω-6 Linoleic and ω-3 α-Linolenic Acids Concentration of Hemp Oil Using Jackknife and Bootstrap Resampling
Authors: Vibha Devi, Shabina Khanam
Abstract:
Hemp (Cannabis sativa) possesses a rich content of ω-6 linoleic and ω-3 linolenic essential fatty acids in a ratio of 3:1, which is a rare and highly desired ratio that enhances the quality of hemp oil. These components are beneficial for cell and body growth, strengthen the immune system, possess anti-inflammatory action, lower the risk of heart problems owing to their anti-clotting property, and serve as a remedy for arthritis and various disorders. The present study employs a supercritical fluid extraction (SFE) approach on hemp seed at various conditions of parameters: temperature (40-80) °C, pressure (200-350) bar, flow rate (5-15) g/min, particle size (0.430-1.015) mm and amount of co-solvent (0-10) % of solvent flow rate, through a central composite design (CCD). The CCD suggested 32 sets of experiments, which were carried out. As the SFE process includes a large number of variables, the present study recommends the application of resampling techniques for cross-validation of the obtained data. Cross-validation refits the model on each resample to obtain information regarding the error, variability, deviation, etc. Bootstrap and jackknife are the most popular resampling techniques; they create a large number of datasets through resampling from the original dataset and analyze these data to check the validity of the obtained data. Jackknife resampling is based on eliminating one observation from the original sample of size N without replacement. For jackknife resampling, the sample size is 31 (eliminating one observation), and this is repeated 32 times. Bootstrap is the frequently used statistical approach for estimating the sampling distribution of an estimator by resampling with replacement from the original sample. For bootstrap resampling, the sample size is 32, and this was repeated 100 times. The estimands for these resampling techniques are the mean, standard deviation, variation coefficient and standard error of the mean. For ω-6 linoleic acid concentration, the mean value was approx. 58.5 for both resampling methods, which is the average (central value) of the sample means of all data points. Similarly, for ω-3 linolenic acid concentration, the mean was observed as 22.5 through both resampling methods. Variance reflects the spread of the data around the mean. A greater variance indicates a larger range of output data, which is 18 for ω-6 linoleic acid (ranging from 48.85 to 63.66%) and 6 for ω-3 linolenic acid (ranging from 16.71 to 26.2%). Further, the low standard deviation (approx. 1%), low standard error of the mean (< 0.8) and low variation coefficient (< 0.2) reflect the accuracy of the sample for prediction. All the estimator values of the variation coefficient, standard deviation and standard error of the mean are found within the 95% confidence interval.
Keywords: resampling, supercritical fluid extraction, hemp oil, cross-validation
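A minimal sketch of the two resampling schemes described above is given below; the ω-6 concentrations are randomly generated stand-ins for the 32 CCD runs, and the summaries are simple statistics of the replicates rather than the study's reported estimates.

```python
import numpy as np

def jackknife(sample, stat=np.mean):
    """Leave-one-out replicates of a statistic (a sample of size N gives N replicates)."""
    sample = np.asarray(sample)
    return np.array([stat(np.delete(sample, i)) for i in range(len(sample))])

def bootstrap(sample, stat=np.mean, n_rep=100, rng=None):
    """Replicates of a statistic obtained by resampling with replacement."""
    rng = rng or np.random.default_rng(0)
    sample = np.asarray(sample)
    return np.array([stat(rng.choice(sample, size=len(sample), replace=True))
                     for _ in range(n_rep)])

def summarise(replicates):
    """Mean, standard deviation, variation coefficient and SE of the mean of the replicates."""
    mean = replicates.mean()
    sd = replicates.std(ddof=1)
    return {"mean": mean, "sd": sd,
            "cv": sd / mean,
            "se_mean": sd / np.sqrt(len(replicates))}

# Hypothetical ω-6 linoleic acid concentrations (%), standing in for the 32 CCD runs.
rng = np.random.default_rng(42)
omega6 = rng.uniform(48.85, 63.66, size=32)
print("jackknife:", summarise(jackknife(omega6)))
print("bootstrap:", summarise(bootstrap(omega6, n_rep=100, rng=rng)))
```

The jackknife loop mirrors the "31 observations, repeated 32 times" scheme, and the bootstrap loop mirrors the "32 observations with replacement, repeated 100 times" scheme described in the abstract.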
351 Mental Health Monitoring System as an Effort for Prevention and Handling of Psychological Problems in Students
Authors: Arif Tri Setyanto, Aditya Nanda Priyatama, Nugraha Arif Karyanta, Fadjri Kirana A., Afia Fitriani, Rini Setyowati, Moh.Abdul Hakim
Abstract:
The Basic Health Research Report by the Ministry of Health (2018) shows an increase in the prevalence of mental health disorders in the adolescent and early adult age ranges. Supporting this finding, data from the psychological examinations of the student health service unit at one state university recorded 115 cases of moderate and severe mental health problems in the period 2016-2019. More specifically, the highest number of cases was experienced by clients in the age range of 21-23 years, which corresponds to the middle to final semesters of study. A total of 29%, or 33 students, experienced anxiety disorders, and 25%, or 29 students, experienced problems ranging from mild to severe, alongside other classifications of disorders, including adjustment disorders, family problems, academic problems, mood disorders, self-concept disorders, personality disorders, cognitive disorders, and others such as trauma and sexual disorders. Various mental health disorders have a significant impact on the academic life of students, such as a low GPA, exceeding the permitted study period, dropping out, disruption of social life on campus, and even suicide. Based on literature reviews and best practices from universities in various countries, one of the effective ways to prevent and treat student mental health disorders is to implement a mental health monitoring system in universities. This study uses a participatory action research approach, with a sample of 423 students from a total population of 32,112 students. The scales used in this study are the Beck Depression Inventory (BDI) to measure depression and the Taylor Minnesota Anxiety Scale (TMAS) to measure anxiety levels. This study aims to (1) develop a digital-based monitoring system that maps students' mental health status into healthy, at-risk, or mentally disordered categories, especially with regard to indications of symptoms of depression and anxiety disorders, and (2) implement the mental health monitoring system in universities at the beginning and end of each semester. The results of the analysis show that, among the 423 respondents, the main problems faced were related to coursework, such as the thesis and academic assignments. Based on the scoring and categorization of the Beck Depression Inventory (BDI), 191 students experienced symptoms of depression: 24.35%, or 103 students, experienced mild depression; 14.42% (61 students) had moderate depression; and 6.38% (27 students) experienced severe or extreme depression. Furthermore, as many as 80.38% (340 students) experienced anxiety in the high category. This article also reviews the student mental health service system on campus.
Keywords: monitoring system, mental health, psychological problems, students
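A small sketch of how BDI totals might be mapped to the depression categories reported above is given below; the cut-off scores are assumptions, since the abstract does not state the thresholds the authors used.

```python
# Illustrative scoring sketch; the cut-off scores below are assumptions and do not
# reproduce the thresholds used by the authors.
BDI_CUTOFFS = [(10, "minimal"), (16, "mild"), (29, "moderate"), (63, "severe/extreme")]

def bdi_category(score: int) -> str:
    """Map a BDI total score (0-63) to an assumed severity category."""
    for upper, label in BDI_CUTOFFS:
        if score <= upper:
            return label
    raise ValueError("BDI total scores range from 0 to 63")

def prevalence(scores):
    """Count and share of respondents falling in each category."""
    counts = {}
    for s in scores:
        label = bdi_category(s)
        counts[label] = counts.get(label, 0) + 1
    return {label: f"{n} ({100 * n / len(scores):.2f}%)" for label, n in counts.items()}

# Hypothetical scores for three respondents.
print(prevalence([8, 18, 35]))
```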
350 Impact of Individual and Neighborhood Social Capital on the Health Status of the Pregnant Women in Riyadh City, Saudi Arabia
Authors: Abrar Almutairi, Alyaa Farouk, Amal Gouda
Abstract:
Background: Social capital is a factor that helps in bonding within a social network. Individual and neighborhood social capital affect the health status of the members of a particular society. In addition to its influence on the health of the population in general, social capital has a significant effect on women, especially pregnant women. The study objective was to assess the impact of social capital on the health status of pregnant women. Design: A descriptive cross-sectional correlational design was utilized in this study. Methods: A convenient sample of 210 pregnant women who attended the outpatient antenatal clinics for follow-up at King Fahad Hospital (Ministry of National Guard Health Affairs/Riyadh) and King Abdullah bin Abdelaziz University Hospital (KAAUH, Ministry of Education/Riyadh) were included in the study. Data were collected using a self-administered questionnaire developed by the researchers based on the "World Bank Social Capital Assessment Tool" and the SF-36 questionnaire (Short Form Health Survey). The questionnaire consists of 4 parts collecting information on socio-demographic data, obstetric and gynecological history, the general health status scale and social activity during pregnancy, and the social capital of the study participants, with different types of questions such as multiple-choice questions, polar questions, and Likert scales. Data analysis was carried out using the Statistical Package for the Social Sciences version 23. Descriptive statistics such as frequency, percentage, mean, and standard deviation were used to describe the sample characteristics, and the simple linear regression test was used to assess the relationships between the different variables, with a level of significance of P≤0.005. Results: This study revealed that only 31.1% of the study participants perceived that they have a good general health status. About two thirds (62.8%) of the participants have moderate social capital, more than one in ten (11.2%) have high social capital and more than a quarter (26%) of them have low social capital. All dimensions of social capital except empowerment and political action had positive significant correlations with the health status of the pregnant women, with P values ranging from 0.001 to 0.010. In general, social capital showed a highly statistically significant association with the health status of the pregnant women (P=0.002). Conclusion: Less than one third of the study participants had a good perceived health status, and the majority of them had moderate social capital, with only about one in ten perceiving that they have high social capital. Finally, neighborhood residency area, family size, sufficiency of income, past medical and surgical history and parity of the study participants all significantly impacted the assessed health domains of the pregnant women.
Keywords: impact, social capital, health status, pregnant women
349 The International Fight against the Financing of Terrorism: Analysis of the Anti-Money Laundering and Combating Financing of Terrorism Regime
Authors: Loukou Amoin Marie Djedri
Abstract:
Financing is important for all terrorists, from the largest organizations in control of territories to the smallest groups, not only for spreading fear through attacks but also for financing the expansion of terrorist dogmas. These organizations pose serious threats to the international community. The disruption of terrorist financing aims to create a hostile environment for the growth of terrorism and to limit considerably the capacities of terrorist groups. The World Bank (WB), together with the International Monetary Fund (IMF), decided to include in their scope the fight against money laundering and the financing of terrorism, in order to assist Member States in protecting their internal financial systems from use and abuse by terrorism and in reinforcing their legal systems. To do so, they have adopted the Anti-Money Laundering/Combating the Financing of Terrorism (AML/CFT) standards set up by the Financial Action Task Force. This set of standards, recognized as the international standard for anti-money laundering and combating the financing of terrorism, has to be implemented by Member States in order to strengthen their judicial systems and relevant national institutions. However, we noted that, to date, some Member States still have significant AML/CFT deficiencies, which can constitute serious threats not only to a country's economic stability but also to the global financial system. In addition, studies have stressed that repressive measures are implemented by countries more than preventive measures, which can be an important weakness in a state's security system. Furthermore, we noticed that the AML/CFT standards evolve slowly, while the techniques used by terrorist networks keep developing. The goal of the study is to show how to enhance global AML/CFT compliance through the work of the IMF and the WB, to help Member States consolidate their financial systems. To encourage and ensure the effectiveness of these standards, a methodology for assessing compliance with the AML/CFT standards has been created to follow up on the concrete implementation of these standards and to provide accurate technical assistance to countries in need. A risk-based approach has also been adopted as a key component of the implementation of the AML/CFT standards, with the aim of strengthening their efficiency. However, we noted that the assessment is not efficient in the process of enhancing AML/CFT measures because it seems to lack adaptation to each country's situation. In other words, internal and external factors are not sufficiently taken into account in a country assessment program. The purpose of this paper is to analyze the AML/CFT regime in the fight against the financing of terrorism and to find lasting solutions to achieve global AML/CFT compliance. The work of all the organizations involved in this fight is imperative to protect the financial network and to lead to the disintegration of terrorist groups in the future.
Keywords: AML/CFT standards, financing of terrorism, international financial institutions, risk-based approach
348 Global Winners versus Local Losers: Globalization Identity and Tradition in Spanish Club Football
Authors: Jim O'brien
Abstract:
Contemporary global representation and consumption of La Liga across a plethora of media platforms and outlets have resulted in significant implications for the historical, political and cultural developments which shaped the development of Spanish club football. This has established and reinforced a hierarchy of a small number of teams belonging, or aspiring to belong, to a cluster of global elite clubs seeking to imitate the blueprint of the English Premier League in respect of corporate branding and marketing, in order to secure a global fan base through success and exposure in La Liga itself and through the Champions League. The synthesis between globalization, global sport and the status of high-profile clubs has created radical change within the folkloric iconography of Spanish football. The main focus of this paper is to critically evaluate the consequences of globalization on the rich tapestry at the core of the game's distinctive history in Spain. The seminal debate underpinning the study considers whether the divergent aspects of globalization have acted as a malevolent force, eroding tradition, causing financial meltdown and reducing much of the fabric of club football to the status of bystanders, or have promoted a renaissance of these traditions, securing their legacies through new fans and audiences. The study draws on extensive sources on the history, politics and culture of Spanish football, in both English and Spanish. It also uses primary and archive material derived from interviews and fieldwork undertaken with scholars, media professionals and club representatives in Spain. The paper has four main themes. Firstly, it contextualizes the key historical, political and cultural forces which shaped the landscape of Spanish football from the late nineteenth century; the seminal notions of region, locality and cultural divergence are pivotal to this discourse. The study then considers the relationship between football, ethnicity and identity as a barometer of continuity and change, suggesting that tradition is being reinvented and re-framed to reflect the shifting demographic and societal patterns within the Spanish state. Following on from this, consideration is given to the paradoxical function of 'El Clasico' and the dominant duopoly of the FC Barcelona - Real Madrid axis in both eroding tradition in the global nexus of football's commodification and in protecting historic political rivalries. To most global consumers of La Liga, the mega-spectacle and hyperbole of 'El Clasico' are the essence of Spanish football, with cultural misrepresentation and distortion catapulting the event to the global media audience. Finally, the paper examines La Liga as a sporting phenomenon in which elite clubs, cult managers and galacticos serve as commodities on the altar of mass consumption in football's global entertainment matrix. These processes accentuate a homogenous mosaic of cultural conformity which obscures local, regional and national identities and paradoxically fuses the global with the local to maintain the distinctive hue of La Liga, as witnessed by the extraordinary successes of Atletico Madrid and FC Eibar in recent seasons.
Keywords: Spanish football, globalization, cultural identity, tradition, folklore
347 A Survey Study Exploring Principal Leadership and Teachers' Expectations in the Social Working Life of Two Swedish Schools
Authors: Anette Forssten Seiser, Ulf Blossing, Mats Ekholm
Abstract:
The expectations on principals to manage, lead and develop their schools and teachers are high. However, principals are not left without guidelines. Policy texts, curricula and syllabuses guide the orientation of their leadership. Moreover, principals' traits and experience, as well as professional norms, are decisive. However, in this study we argue for the importance of deepening the knowledge of how the practice of leadership is shaped in the daily social working life with the teachers at the school. Teachers' experiences and expectations of leadership influence the principal's actions, sometimes perhaps contrary to what is emphasized in official texts like the central guidelines. The expectations of teachers make up the norms of the school and thus constitute the local school culture. The aim of this study is to deepen the knowledge of teachers' expectations of their principals to manage, lead and develop their schools. Two questions guide the study: 1) How do teachers' and principals' expectations differ in realistic situations? 2) How do teachers' experience-based expectations differ from more ideal expectations? To investigate teachers' expectations of their principals, we use a social psychological perspective framed within an organisational development perspective. A social role is defined by the fact that, within the framework of the role, different people who fulfil the same role exhibit greater similarities than differences in their actions. The way a social role is exercised depends on the expectations placed on the role's position but also on the expectations of the function of the role. The way in which the social role is embodied in practice also depends on how the person fulfilling the role perceives and understands those expectations. Based on interviews with school principals, a questionnaire was constructed. Nine possible real-life and critical incidents were described that are important for role shaping in the dynamics between teachers and principals. Teachers were asked to make a choice between three, four, or five possible and realistic courses of action for the principal. The teachers were also asked to make two choices between these different options in real-life situations: one ideal, as if they were working as a principal themselves, and one experience-based, i.e., how they estimated that their own principal would act in such a situation. The sample consists of two elementary schools in Sweden. School A consists of two principals and 38 teachers and School B of two principals and 22 teachers. The response rate among the teachers is 95 percent in School A and 86 percent in School B. All four principals answered our questions. The results show that the expectations of teachers and principals can be understood as variations of being harmonious or disharmonious. Harmonious expectations can be interpreted as leading to an attuned leadership, while disharmonious expectations lead to a more tense leadership. Harmonious expectations and an attuned leadership are prominent. The results are compared to earlier research on leadership. Attuned and more tense leadership are discussed in relation to school development and future research.
Keywords: critical incidents, principal leadership, school culture, school development, teachers' expectations
346 The Impact of Sensory Overload on Students on the Autism Spectrum in Italian Inclusive Classrooms: Teachers' Perspectives and Training Needs
Authors: Paola Molteni, Luigi d’Alonzo
Abstract:
Background: Sensory issues are now considered one of the key aspects in defining and diagnosing autism, changing perspectives on behavioural analysis and intervention in mainstream educational services. However, Italian teachers' training is not yet specific to the topic of autism and its sensory-related effects, and this research investigates teachers' capability to understand students' needs and challenging behaviours in the light of sensory perceptions. Objectives: The research aims to analyse mainstream school teachers' awareness of students' sensory perceptions and how this affects classroom inclusion and the learning process. The research questions are: i) Are teachers able to identify students' sensory issues?; ii) Are trained teachers more able to identify sensory problems than untrained ones?; iii) What is the impact of sensory issues on inclusion in mainstream classrooms?; iv) What should teachers know about autistic sensory dimensions? Methods: This research was designed as a pilot study involving a multi-methods approach, including action and collaborative research methodology. The design allowed the researcher to capture the complexity of a province school district (from kindergarten to high school) through a detailed analysis of selected aspects. The researcher explored the questions described above through 133 questionnaires and 6 focus groups. The qualitative and quantitative data collected during the research were analysed using Interpretative Phenomenological Analysis (IPA). Results: Mainstream school teachers are not able to confidently recognise the sensory issues of children included in the classroom. The research underlines: how professionals with no specific training on autism are not able to recognise sensory problems in students on the spectrum; how hearing and sight issues have the highest impact on classroom inclusion and students' learning processes; and how a lack of understanding is often followed by misinterpretations of the impact of sensory issues and challenging behaviours. Conclusions: As this research has shown, promoting and enhancing the understanding of sensory issues related to autism is fundamental to enabling mainstream school teachers to define educational and life-long plans able to properly answer students' needs and support their real inclusion in the classroom. This study is a good example of how educational research can meet and support daily practice in working with people on the autism spectrum and inform the design of training for mainstream school teachers: the emerging need for dedicated preparation on sensory issues must be considered when planning school district in-service training programmes, specifically tailored to inclusive services.
Keywords: autism spectrum condition, scholastic inclusion, sensory overload, teacher's training
345 Validation of an Acuity Measurement Tool for Maternity Services
Authors: Cherrie Lowe
Abstract:
The TrendCare Patient Dependency System is currently utilized by a large number of maternity services across Australia, New Zealand and Singapore. In 2012, 2013, and 2014, validation studies were initiated in all three countries to validate the acuity tools used for women in labour and for postnatal mothers and babies. This paper will present the findings of the validation study. Aim: The aims of this study were to: identify whether the care hours provided by the TrendCare acuity system were an accurate reflection of the care required by women and babies; and obtain evidence of changes required to acuity indicators and/or category timings to ensure the TrendCare acuity system remains reliable and valid across a range of maternity care models in three countries. Method: A non-experimental action research methodology was used across four District Health Boards in New Zealand, two large public Australian maternity services and a large tertiary maternity service in Singapore. Standardized data collection forms and timing devices were used to collect midwife contact times with the women and babies included in the study. Rejection processes excluded samples where care was not completed or was rationed. The variances between the actual timed midwife/mother/baby contact and the TrendCare acuity times were identified and investigated. Results: 87.5% (18) of the TrendCare acuity category timings matched the actual timings recorded for midwifery care. 12.5% (3) of the TrendCare night duty categories provided fewer minutes of care than the actual timings. 100% of the labour ward TrendCare categories matched the actual timings for midwifery care. The actual times recorded for assistance given to New Zealand independent midwives in the labour ward showed a significant deviation from previous studies, demonstrating the need for additional time allocations in TrendCare. Conclusion: The results demonstrated the importance of regularly validating the TrendCare category timings against the care hours required, as variations in models of care and length of stay in maternity units have increased midwifery workloads on the night shift. The level of assistance provided by the core labour ward staff to the independent midwife has increased substantially. Outcomes: As a consequence of this study, changes were made to the night duty TrendCare maternity categories, additional acuity indicators were developed and the times for assisting independent midwives were increased. The updated TrendCare version was delivered to maternity services in 2014.
Keywords: maternity, acuity, research, nursing workloads
Procedia PDF Downloads 381
344 Through Additive Manufacturing. A New Perspective for the Mass Production of Made in Italy Products
Authors: Elisabetta Cianfanelli, Paolo Pupparo, Maria Claudia Coppola
Abstract:
The recent evolution of innovation processes and of the intrinsic tendencies of the product development process leads to new considerations on the design flow. The instability and complexity that describe contemporary life define new problems in the production of products, stimulating at the same time the adoption of new solutions across the entire design process. The advent of Additive Manufacturing, but also of IoT and AI technologies, continuously puts us in front of new paradigms regarding design as a social activity. From the point of view of application, these technologies raise a whole series of problems and considerations immanent to design thinking. Addressing these problems may require some initial intuition and the use of a provisional set of rules or plausible strategies, i.e., heuristic reasoning. At the same time, however, the evolution of digital technology and the computational speed of new design tools describe a new and contrary design framework in which to operate. It is therefore interesting to understand the opportunities and boundaries of the new man-algorithm relationship. The contribution investigates the man-algorithm relationship starting from the state of the art of the Made in Italy model; the best-known fields of application are described, with a focus on specific cases in which the mutual relationship between man and AI becomes a new driving force of innovation for entire production chains. On the other hand, the use of algorithms could absorb many design phases, such as the definition of shape, dimensions, proportions, materials, static verifications, and simulations. Operating in this context therefore becomes a strategic action, capable of defining fundamental choices for the design of product systems in the near future. If there is a human-algorithm combination within a new integrated system, quantitative values can be controlled in relation to qualitative and material values. The trajectory that is described therefore becomes a new design horizon in which to operate, where it is interesting to highlight the good practices that already exist. In this context, the designer developing new forms can experiment with ways still unexpressed in the project and can define a new synthesis and simplification of algorithms, so that each artifact carries a signature that defines it in all its parts, emotional and structural. This signature of the designer, a combination of values and design culture, will be internal to the algorithms and able to relate to digital technologies, creating a generative dialogue for design purposes. The result that is envisaged indicates a new vision of digital technologies, no longer understood only as custodians of vast quantities of information, but also as a valid integrated tool in close relationship with the design culture. Keywords: decision making, design heuristics, product design, product design process, design paradigms
Procedia PDF Downloads 121
343 Safety Validation of Black-Box Autonomous Systems: A Multi-Fidelity Reinforcement Learning Approach
Authors: Jared Beard, Ali Baheri
Abstract:
As autonomous systems become more prominent in society, ensuring their safe application becomes increasingly important. This is clearly demonstrated by autonomous cars traveling through a crowded city or robots traversing a warehouse with heavy equipment. Human environments can be complex, having high-dimensional state and action spaces. This gives rise to two problems. One is that analytic solutions may not be possible. The other is that, in simulation-based approaches, searching the entirety of the problem space could be computationally intractable, ruling out formal methods. To overcome this, approximate solutions may seek to find failures or estimate their likelihood of occurrence. One such approach is adaptive stress testing (AST), which uses reinforcement learning to induce failures in the system. Its premise is that a learned model can be used to help find new failure scenarios, making better use of simulations. Despite finding such failures, AST struggles to find particularly sparse failures and can be inclined to find solutions similar to those found previously. To help overcome this, multi-fidelity learning can be used: information from lower-fidelity simulations can be used to build up samples less expensively and to cover the solution space more effectively, finding a broader set of failures. Recent work in multi-fidelity learning has passed information bidirectionally using “knows what it knows” (KWIK) reinforcement learners to minimize the number of samples in high-fidelity simulators (thereby reducing computation time and load). The contribution of this work, then, is the development of a bidirectional multi-fidelity AST framework. Such an algorithm uses multi-fidelity KWIK learners in an adversarial context to find failure modes. Thus far, a KWIK learner has been used to train an adversary in a grid world to prevent an agent from reaching its goal, demonstrating the utility of KWIK learners in an AST framework. The next step is implementation of the bidirectional multi-fidelity AST framework described. Testing will be conducted in a grid world containing an agent attempting to reach a goal position and an adversary tasked with intercepting the agent, as demonstrated previously. Fidelities will be modified by adjusting the size of a time-step, with higher fidelity effectively allowing for more responsive closed-loop feedback. Results will compare the single-fidelity KWIK AST learner with the multi-fidelity algorithm with respect to the number of samples, the distinct failure modes found, and the relative effect of learning after a number of trials. Keywords: multi-fidelity reinforcement learning, multi-fidelity simulation, safety validation, falsification
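A minimal, self-contained sketch of the adaptive-stress-testing idea described above: a tabular Q-learning adversary (standing in for the KWIK learner, which is not shown) is rewarded for intercepting a goal-seeking agent in a toy grid world. The grid size, rewards, and hyperparameters are illustrative assumptions, not values from the study.

```python
import random
from collections import defaultdict

SIZE = 5                                    # toy SIZE x SIZE grid world
GOAL = (SIZE - 1, SIZE - 1)                 # the agent under test heads here
MOVES = [(0, 1), (0, -1), (1, 0), (-1, 0)]  # adversary's action set

def clamp(x, y):
    return (min(max(x, 0), SIZE - 1), min(max(y, 0), SIZE - 1))

def agent_step(pos):
    """System under test: a simple policy that walks x-first toward the goal."""
    x, y = pos
    if x != GOAL[0]:
        x += 1 if GOAL[0] > x else -1
    elif y != GOAL[1]:
        y += 1 if GOAL[1] > y else -1
    return (x, y)

Q = defaultdict(float)                      # adversary's Q[(state, action)]
ALPHA, GAMMA, EPS = 0.1, 0.95, 0.2

def episode(max_steps=20):
    """One stress-testing rollout; the adversary is rewarded for interception."""
    agent, adv = (0, 0), (SIZE - 1, 0)
    for _ in range(max_steps):
        state = (agent, adv)
        if random.random() < EPS:
            a = random.randrange(len(MOVES))
        else:
            a = max(range(len(MOVES)), key=lambda i: Q[(state, i)])
        adv = clamp(adv[0] + MOVES[a][0], adv[1] + MOVES[a][1])
        agent = agent_step(agent)
        failure = agent == adv              # "failure" of the system under test
        reward = 10.0 if failure else -0.1  # step cost favours short failure paths
        nxt = (agent, adv)
        best_next = max(Q[(nxt, i)] for i in range(len(MOVES)))
        Q[(state, a)] += ALPHA * (reward + GAMMA * best_next - Q[(state, a)])
        if failure or agent == GOAL:
            return failure
    return False

random.seed(0)
failures = 0
for ep in range(1, 5001):
    failures += episode()
    if ep % 1000 == 0:
        print(f"after {ep} episodes: {failures} failure scenarios found so far")
```

A multi-fidelity variant could, for instance, reuse Q-values learned with a coarser time-step to warm-start the higher-fidelity search, which is the kind of information transfer the bidirectional framework targets.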
Procedia PDF Downloads 161
342 Neighborhood Relations in a Context of Cultural and Social Diversity - Qualitative Analysis of a Case Study in a Territory in the inner City of Lisbon
Authors: Madalena Corte-real, João Pedro Nunes, Bernardo Fernandes, Ana Jorge Correira
Abstract:
This presentation looks, from a sociological perspective, at neighboring practices in the inner city of Lisbon. The capital of Portugal, with half a million inhabitants and inserted in a metropolitan area with almost 2.9 million people, has been in the international spotlight, seen as an interesting city to live in and to invest in, especially in the real estate market. This promotion emerged in the context of the financial crisis, when local authorities aimed to make Lisbon a more competitive city, calling for visitors and for financial and human capital. Especially in the last decade, Portugal's capital has been experiencing a significant increase in migration, from creative and entrepreneurial exiles to economic and political expats. In this context, the territory under analysis is a mixed-use area undergoing rapid transformation in recent years, marked by the presence of newcomers and non-nationals as well as by social and cultural heterogeneity. It is next to one of the main arteries, considered the most multicultural part of the city, and presented in the press as one of the coolest neighborhoods in Europe. In view of these aspects, this research aims to address key topics in current urban research: the anonymity often related to big cities, socio-spatial attachment to the neighborhood, and the effects of diversity on the everyday relations of residents and shopkeepers. This case study intends to look at particularities in local regimes differently affected by growing mobility. Against a backdrop of unidimensional generalizations and a tendency to refer to central countries and global cities, it aims to discuss national and local specificities. In methodological terms, the project comprises essentially a qualitative approach that consists of direct observation techniques and ethnographic methods as well as semi-structured interviews with residents and local stakeholders, whose narratives are subject to content analysis. The paper starts with a characterization of the broader context of the city of Lisbon, followed by territorial specificities regarding socio-spatial development, namely the city's and the inner area's morphology as well as the population's socioeconomic profile. Following the residents' and stakeholders' narratives and practices, it assesses perceptions and behaviors regarding the representation of the area, relationships and experiences, routines, and sociability. Results point to a significant presence of neighborhood relations and different forms of support, in particular among the different groups, e.g., old long-time residents, middle-class families, the global creative class, and communities of economic migrants. Fieldwork reveals low levels of place attachment, although some residents presently report high levels of satisfaction. Engagement with living space, this case study suggests, reveals the social construction and lived experience of neighboring by different groups, but also the way different and contrasting visions and desires are articulated with the profound urban, cultural and political changes that permeate the area. Keywords: diversity, Lisbon, neighboring and neighborhood, place-attachment
Procedia PDF Downloads 114
341 Effect of Fermented Orange Juice Intake on Urinary 6‑Sulfatoxymelatonin in Healthy Volunteers
Authors: I. Cerrillo, A. Carrillo-Vico, M. A. Ortega, B. Escudero-López, N. Álvarez-Sánchez, F. Martín, M. S. Fernández-Pachón
Abstract:
Melatonin is a bioactive compound involved in multiple biological activities such as glucose tolerance, circadian rhythm regulation, antioxidant defense and immune system action. In elderly subjects, the intake of foods and drinks rich in melatonin is very important because its endogenous level decreases with age. Alcoholic fermentation is a process carried out on fruits, vegetables and legumes to obtain new products with an improved bioactive compound profile relative to the original substrates. The alcoholic fermentation process carried out by Saccharomycetaceae var. Pichia kluyveri induces an important synthesis of melatonin in orange juice. A novel beverage derived from fermented orange juice could therefore be a promising source of this bioactive compound. The aim of the present study was to determine whether the acute intake of fermented orange juice increases the levels of urinary 6-sulfatoxymelatonin in healthy humans. Nine healthy volunteers (7 women and 2 men), aged between 20 and 25 years old and with a BMI of 21.1 ± 2.4 kg/m², were recruited. On the study day, participants ingested 500 mL of fermented orange juice. The first urine collection was made before fermented orange juice consumption (basal). The remaining urine collections were made in the following time intervals after fermented orange juice consumption: 0-2, 2-5, 5-10, 10-15 and 15-24 hours. During the experimental period, only the consumption of water was allowed. At lunch time a meal was provided (60 g of white bread, two slices of ham, a slice of cheese, 125 g of sweetened natural yoghurt and water). The subjects repeated the protocol with orange juice, with a 2-wk washout period between both types of beverages. The levels of 6-sulfatoxymelatonin (6-SMT) were measured in urine collected at the different time points using the Melatonin-Sulfate Urine ELISA (IBL International GMBH, Hamburg, Germany). Levels of 6-SMT were corrected to those of creatinine for each sample. A significant (p < 0.05) increase in urinary 6-SMT levels was observed between 2-5 hours after fermented orange juice ingestion with respect to basal values (an increase of 67.8%). The consumption of orange juice did not induce any significant change in urinary 6-SMT levels. In addition, urinary 6-SMT levels obtained between 2-5 hours after fermented orange juice ingestion (115.6 ng/mg) were significantly different (p < 0.05) from those of orange juice (42.4 ng/mg). The enhancement of urinary 6-SMT after the ingestion of 500 mL of fermented orange juice in healthy humans, compared to orange juice, could be an important advantage of this novel product as an excellent source of melatonin. Fermented orange juice could be a new functional food, and its consumption could exert a potentially positive effect on health, both in the maintenance of health status and in the prevention of chronic diseases. Keywords: fermented orange juice, functional beverage, healthy human, melatonin
Procedia PDF Downloads 412
340 Ultrasound Assisted Alkaline Potassium Permanganate Pre-Treatment of Spent Coffee Waste
Authors: Rajeev Ravindran, Amit K. Jaiswal
Abstract:
Lignocellulose is the largest reservoir of inexpensive, renewable carbon. It is composed of lignin, cellulose and hemicellulose. Cellulose and hemicellulose are composed of the reducing sugars glucose, xylose and several other monosaccharides, which can be metabolised by microorganisms to produce several value-added products such as biofuels, enzymes, amino acids, etc. Enzymatic treatment of lignocellulose leads to the release of monosaccharides such as glucose and xylose. However, factors such as the presence of lignin, crystalline cellulose, acetyl groups, pectin, etc. contribute to recalcitrance, restricting the effective enzymatic hydrolysis of cellulose and hemicellulose. To overcome these problems, pre-treatment of lignocellulose is generally carried out, which essentially facilitates better degradation of lignocellulose. A range of pre-treatment strategies is commonly employed based on the mode of action, viz. physical, chemical, biological and physico-chemical. However, existing pre-treatment strategies result in lower sugar yields and the formation of inhibitory compounds. To address these problems, we propose a novel pre-treatment which utilises the superior oxidising capacity of alkaline potassium permanganate, assisted by ultra-sonication, to break the covalent bonds in spent coffee waste and remove recalcitrant compounds such as lignin. The pre-treatment was conducted for 30 minutes using 2% (w/v) potassium permanganate at room temperature with a solid to liquid ratio of 1:10. The pre-treated spent coffee waste (SCW) was subjected to enzymatic hydrolysis using the enzymes cellulase and hemicellulase. Shake flask experiments were conducted with a working volume of 50 mL buffer containing 1% substrate. The results showed that after 24 hours the novel pre-treatment strategy yielded 7 g/L of reducing sugar, compared to 3.71 g/L obtained from biomass that had undergone dilute acid hydrolysis. From the results obtained, it is fairly certain that ultrasonication assists the oxidation of recalcitrant components in lignocellulose by potassium permanganate. Enzyme hydrolysis studies suggest that ultrasound assisted alkaline potassium permanganate pre-treatment is far superior to treatment by dilute acid. Furthermore, SEM, XRD and FTIR analyses were carried out to assess the effect of the new pre-treatment strategy on the structure and crystallinity of pre-treated spent coffee waste. This novel one-step pre-treatment strategy was implemented under mild conditions and exhibited high efficiency in the enzymatic hydrolysis of spent coffee waste. Further study and scale-up are in progress in order to realise future industrial applications. Keywords: spent coffee waste, alkaline potassium permanganate, ultra-sonication, physical characterisation
Procedia PDF Downloads 360
339 Comparative Histological, Immunohistochemical and Biochemical Study on the Effect of Vit. C, Vit. E, Gallic Acid and Silymarin on Carbon Tetrachloride Model of Liver Fibrosis in Rats
Authors: Safaa S. Hassan, Mohammed H. Elbakry, Safwat A. Mangoura, Zainab M. Omar
Abstract:
Background: Liver fibrosis is the main reason for increased mortality in chronic liver disease. It has no standard treatment. Antioxidants from a variety of sources are capable of slowing or preventing the oxidation of other molecules. Aim: To evaluate the hepatoprotective effect of vit. C, vit. E and gallic acid in comparison to silymarin in the rat model of carbon tetrachloride-induced liver fibrosis, and their possible mechanisms of action. Material & Methods: A total of 60 adult male albino rats (160-200 g) were divided into six equal groups and received subcutaneous (s.c.) injections for 8 weeks. Group I: control. Group II: received 1.5 mL/kg of CCl4. Group III: CCl4 and co-treatment with silymarin 100 mg/kg p.o. daily. Group IV: CCl4 and co-treatment with vit. C 50 mg/kg p.o. daily. Group V: CCl4 and co-treatment with vit. E 200 mg/kg p.o. Group VI: CCl4 and co-treatment with gallic acid 100 mg/kg p.o. daily. The liver was processed for histological and immunohistochemical examination. Levels of AST, ALT, ALP, reduced GSH, MDA, SOD and hydroxyproline concentration were measured and evaluated statistically. Results: Light and electron microscopic examination of the liver of group II exhibited foci of altered cells with dense nuclei and vacuolated, granular cytoplasm, mononuclear cell infiltration in portal areas, and profuse collagen fiber deposits around the portal tracts; intensely stained α-SMA-positive cells occupied most of the fibrotic liver tissue, with electron-lucent areas in the cytoplasm of the hepatocytes and margination of nuclear chromatin. Treatment with any of the antioxidants variably reduced the hepatic structural changes induced by CCl4. Biochemical analysis showed that carbon tetrachloride significantly increased the levels of serum AST, ALT and ALP and the hepatic malondialdehyde and hydroxyproline content. Moreover, it decreased the activity of superoxide dismutase and the level of glutathione. Treatment with silymarin, gallic acid, vit. C and vit. E significantly decreased AST, ALT and ALP levels in plasma as well as MDA and hydroxyproline, and increased the activity of SOD and the level of glutathione in liver tissue. The effects of CCl4 administration were improved by the antioxidants used, to varying degrees. The most efficient antioxidant was silymarin, followed by gallic acid and vit. C, then vit. E, possibly owing to their antioxidant effect, free radical scavenging properties and the reduction of oxidant-dependent activation and proliferation of HSCs. Conclusion: These antioxidants can be promising candidates for ameliorating liver fibrosis while avoiding the side effects of currently used drugs. Keywords: antioxidant, CCl4, gallic acid, liver fibrosis
Procedia PDF Downloads 275
338 Social Vulnerability Mapping in New York City to Discuss Current Adaptation Practice
Authors: Diana Reckien
Abstract:
Vulnerability assessments are increasingly used to support policy-making in complex environments such as urban areas. Usually, vulnerability studies include the construction of aggregate (sub-)indices and the subsequent mapping of those indices across an area of interest. Vulnerability studies offer several advantages: they are great communication tools, can inform a wider general debate about environmental issues, and can help allocate and efficiently target scarce resources for adaptation policy and planning. However, they also face a number of challenges: vulnerability assessments are constructed on the basis of a wide range of methodologies and there is no single framework or methodology that has proven to serve best in certain environments; indicators vary highly according to the spatial scale used; different variables and metrics produce different results; and aggregate or composite vulnerability indicators that are mapped easily distort or bias the picture of vulnerability, as they hide the underlying causes of vulnerability and level out conflicting reasons for vulnerability in space. There is therefore an urgent need to further develop the methodology of vulnerability studies towards a common framework, which is one motivation for this paper. We introduce a social vulnerability approach, which is compared with the relatively more developed bio-physical and sectoral vulnerability approaches in terms of a common methodology for index construction, guidelines for mapping, assessment of sensitivity, and verification of variables. Two approaches are commonly pursued in the literature. The first is an additive approach, in which all potentially influential variables are weighted according to their importance for the vulnerability aspect and then added to form a composite vulnerability index per unit area. The second approach includes variable reduction, mostly Principal Component Analysis (PCA), which reduces the set of interrelated variables to a smaller number of less correlated components, which are also added to form a composite index. We test these two approaches to constructing indices, as well as two different metrics of input variables, on the area of New York City and compare the outcomes for the five boroughs of NY. Our analysis shows that the mapping exercise yields particularly different results in the outer regions and parts of the boroughs, such as outer Queens and Staten Island. However, some of these parts, particularly the coastal areas, receive the highest attention in current adaptation policy. We infer from this that the current adaptation policy and practice in NY might need to be discussed, as these outer urban areas show relatively low social vulnerability compared with the more central parts, i.e., the high-density areas of Manhattan, Central Brooklyn, Central Queens and the southern Bronx. The inner urban parts receive less adaptation attention but bear a higher risk of damage in case of hazards. This is conceivable, e.g., during large heatwaves, which would affect the inner and poorer parts of the city more than the outer urban areas.
In light of the recent planning practice of NY, one needs to question and discuss who in NY makes adaptation policy for whom; the presented analyses point towards an underrepresentation of the needs of the socially vulnerable population, such as the poor, the elderly, and ethnic minorities, in the current adaptation practice in New York City. Keywords: vulnerability mapping, social vulnerability, additive approach, Principal Component Analysis (PCA), New York City, United States, adaptation, social sensitivity
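The two index-construction approaches contrasted above can be illustrated with a short sketch on synthetic tract-level data; the indicator names, weights, and variance threshold are assumptions for demonstration only, not the study's actual inputs.

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(42)
n_tracts = 200

# Synthetic tract-level indicators driven by two latent factors plus noise,
# standing in for e.g. % poverty, % over 65, % minority, % households without a car
latent = rng.normal(size=(n_tracts, 2))
X = latent @ rng.normal(size=(2, 4)) + 0.3 * rng.normal(size=(n_tracts, 4))
X = (X - X.mean(axis=0)) / X.std(axis=0)          # z-score each indicator

# Approach 1: additive index with (assumed) expert weights summing to 1
weights = np.array([0.4, 0.2, 0.2, 0.2])
additive_index = X @ weights

# Approach 2: PCA, keeping the components that explain ~80% of the variance
pca = PCA(n_components=0.8, svd_solver="full")
pca_index = pca.fit_transform(X).sum(axis=1)

corr = np.corrcoef(additive_index, pca_index)[0, 1]
print(f"retained components: {pca.n_components_}, correlation between indices: {corr:.2f}")
```

Mapping the two composites side by side over the same spatial units is where the kind of divergence between central and outer boroughs discussed above becomes visible.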
Procedia PDF Downloads 398
337 Learning Curve Effect on Materials Procurement Schedule of Multiple Sister Ships
Authors: Vijaya Dixit, Aasheesh Dixit
Abstract:
The shipbuilding industry operates in an Engineer-Procure-Construct (EPC) context. The product mix of a shipyard comprises various types of ships such as bulk carriers, tankers, barges, coast guard vessels, submarines, etc. Each order is unique, based on the type of ship and customized requirements, which are engineered into the product right from the design stage. Thus, to execute every new project, a shipyard needs to upgrade its production expertise. As a result, over the long run, holistic learning occurs across different types of projects, which contributes to the knowledge base of the shipyard. Simultaneously, in the short term, during execution of a project comprising multiple sister ships, repetition of similar tasks leads to learning at the activity level. This research aims to capture both kinds of learning in a shipyard and incorporate the learning curve effect into project scheduling and materials procurement to improve project performance. Extant literature supports the existence of such learning in an organization. In shipbuilding, there are sequences of similar activities which are expected to exhibit learning curve behavior, for example, the nearly identical structural sub-blocks which are successively fabricated, erected, and outfitted with piping and electrical systems. A learning curve representation can model not only a decrease in the mean completion time of an activity but also a decrease in the uncertainty of activity duration. Sister ships have similar material requirements, and the same supplier base supplies materials for all the sister ships within a project. On one hand, this provides an opportunity to reduce transportation cost by batching the order quantities of multiple ships. On the other hand, it increases the inventory holding cost at the shipyard and the risk of obsolescence. Further, due to the learning curve effect, the production schedule of each subsequent ship gets compressed. Thus, the material requirement schedule of each ship differs from that of its predecessor. As more and more ships get constructed, compressed production schedules increase the possibility of batching the orders of sister ships. This work aims at integrating materials management with project scheduling of long-duration projects for the manufacture of multiple sister ships. It incorporates the learning curve effect on progressively compressing material requirement schedules and addresses the above trade-off between transportation cost and inventory holding and shortage costs while satisfying budget constraints at various stages of the project. The activity durations and lead times of items are not crisp and are available in the form of probability distributions. A Stochastic Mixed Integer Programming (SMIP) model is formulated and solved using an evolutionary algorithm. Its output provides ordering dates and the degree of order batching for all types of items. Sensitivity analysis determines the threshold number of sister ships required in a project to leverage the advantage of the learning curve effect in materials management decisions. This analysis will help materials managers gain insights into when and to what degree it is beneficial to treat a multiple-ship project as an integrated one by batching the order quantities, and when and to what degree to practice distinct procurement for individual ships. Keywords: learning curve, materials management, shipbuilding, sister ships
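As an illustration of how a learning curve compresses successive sister-ship schedules, the following sketch applies a generic Wright-style curve to three notional activities; the 90% learning rate and base durations are assumptions for illustration, not parameters of the SMIP model described above.

```python
import math

LEARNING_RATE = 0.90   # 90% curve: each doubling of ship count cuts activity time by 10%

def unit_time(first_unit_time, unit_no, rate=LEARNING_RATE):
    """Wright's learning curve: T_n = T_1 * n ** log2(rate)."""
    return first_unit_time * unit_no ** math.log2(rate)

# Notional activity durations (days) for the first ship of the series
base_activities = {"block fabrication": 60, "erection": 40, "outfitting": 50}

for ship in range(1, 5):
    durations = {a: unit_time(t, ship) for a, t in base_activities.items()}
    total = sum(durations.values())                       # crude serial schedule
    # outfitting material is needed once fabrication and erection are finished
    need_day = durations["block fabrication"] + durations["erection"]
    print(f"ship {ship}: schedule {total:5.1f} d, outfitting material needed on day {need_day:5.1f}")
```

The progressively earlier material need dates are what create the batching opportunity, and the trade-off against holding and obsolescence costs, that the stochastic model balances.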
Procedia PDF Downloads 503
336 Dengue Prevention and Control in Kaohsiung City
Authors: Chiu-Wen Chang, I-Yun Chang, Wei-Ting Chen, Hui-Ping Ho, Ruei-Hun Chang, Joh-Jong Huang
Abstract:
Kaohsiung City is located in a tropical region where Aedes aegypti and Aedes albopictus are distributed; once the virus invades, it can easily trigger a local epidemic. Moreover, Kaohsiung City has a world-class airport and harbor, and its trade and tourism ties with other countries, especially the Southeast Asian countries that also suffer from dengue, are close and frequent. Therefore, Kaohsiung City faces the difficult challenge of dengue every year. The objective of this study was to enhance dengue clinical care, border management and vector surveillance in Kaohsiung City by establishing larger-scale, innovative and more coordinated dengue prevention and control strategies in 2016, including: (1) Integrated medical programs: facilitated 657 contracted medical institutions, widely set up NS1 rapid tests in clinics, enhanced the triage and referral system, and implemented daily monitoring and management of dengue cases; (2) Border quarantine: comprehensive NS1 screening for foreign workers and fishery workers upon immigration, hospitalization and isolation for suspected cases, and health education for high-risk groups (foreign students and other visitors); (3) Mosquito control: wide use of Gravitraps to monitor mosquito density in the environment and use of NS1 rapid screening tests to detect dengue virus in the community; (4) Health education: creation of a dengue app allowing people to immediately consult the risk map and nearby medical resources, routine health education in all districts to strengthen the public's dengue knowledge, and a neighborhood cleaning awards program. The results showed that after the new integrated dengue prevention and control strategies were fully implemented in Kaohsiung City, the number of confirmed cases in 2016 declined to 342, the majority of which were a continuation of the 2015 epidemic; in fact, only two cases were confirmed after the summer of 2016. Moreover, the dengue mortality rate successfully decreased to 0% in 2016. Furthermore, comparing the reporting rates of medical institutions between 2014 and 2016, the rate from medical centers dropped from 27.07% to 19.45% and that from regional hospitals decreased from 36.55% to 29.79%, whereas the reporting rate of district hospitals increased from 11.88% to 15.87% and that of general practice clinics rose from 24.51% to 34.89%. This shows that, with strengthened medical management, the medical centers' share of notifications was reduced and that of general clinics improved, achieving a substantial effect on dengue clinical management and dengue control. Keywords: dengue control, integrated control strategies, clinical management, NS1
Procedia PDF Downloads 275
335 From Avatars to Humans: A Hybrid World Theory and Human Computer Interaction Experimentations with Virtual Reality Technologies
Authors: Juan Pablo Bertuzzi, Mauro Chiarella
Abstract:
Employing a communication studies perspective and a socio-technological approach, this paper introduces a theoretical framework for understanding the concept of the hybrid world, the avatarization phenomenon, and the communicational archetype of co-hybridization. This analysis intends to contribute to the future design of experimental virtual reality applications. Ultimately, this paper presents an ongoing research project that proposes the study of human-avatar interactions in digital educational environments, as well as an innovative reflection on inner digital communication. The aforementioned project analyses human-avatar interactions through the development of an interactive experience in virtual reality. The goal is to generate an innovative communicational dimension that could reinforce the hypotheses presented throughout this paper. Being intended for initial application in educational environments, the analysis and results of this research depend on, and have been prepared with regard to, the meticulous planning of: the conception of a 3D digital platform; the interactive game objects; the AI or computer avatars; the human representation as hybrid avatars; and lastly, the potential for immersion, ergonomics and control diversity that the chosen virtual reality system and game engine can provide. The project is divided into two main axes. The first part is the structural one, as it is mandatory for the construction of an original prototype. The 3D model is inspired by the physical space of an academic institution. The incorporation of smart objects, avatars, game mechanics, game objects, and a dialogue system will be part of the prototype. These elements all have the objective of gamifying the educational environment. To generate continuous participation and a large number of interactions, the digital world will be navigable both on a conventional device and in a virtual reality system. This decision was made, practically, to facilitate communication between students and teachers, and strategically, because it will help populate the digital environment faster. The second part concentrates on content production and further data analysis. The challenge is to offer a diversity of scenarios that compels users to interact and to question their digital embodiment. The multipath narrative content being applied is focused on the subjects covered in this paper. Furthermore, the experience with virtual reality devices invites users to experiment in a mixture of a seemingly infinite digital world and a small physical area of movement. This combination will lead the narrative content and will be crucial in order to restrict users' interactions. The main point is to stimulate and grow in the user the need for his or her hybrid avatar's help. By building an inner communication between the user's physicality and the user's digital extension, the interactions will serve as a self-guide through the gameworld. This is the first attempt to make the avatarization phenomenon explicit and to further analyze the communicational archetype of co-hybridization. The challenge of the upcoming years will be to take advantage of these forms of generalized avatarization in order to create awareness and establish innovative forms of hybridization. Keywords: avatar, hybrid worlds, socio-technology, virtual reality
Procedia PDF Downloads 147
334 Physical Exam-Indicated Cerclage with Mesh Cap Prolonged Gestation on Average for 9 Weeks and 4 Days: 11 Years of Experience
Authors: M. Keršič, M. Lužnik, J. Lužnik
Abstract:
Cervical dilatation and membrane herniation before the 26th week of gestation pose a very high risk of extremely and very premature childbirth. Cerclage with a mesh cap (mesh cerclage, MC) can greatly diminish this risk and provide additional positive effects. Between 2005 and 2014, MC was performed in 9 patients with singleton pregnancies who had membranes prolapsed beyond the external cervical/uterine os before the 25th week of pregnancy (in one case, in the 29th week). With the patient under general anaesthesia, in the lithotomy and Trendelenburg position (about 25°), the prolapsed membranes were repositioned in the uterine cavity using a tampon soaked in antiseptic solution (Skinsept mucosa). A circular, purse-string-type suture (main band) with double string Ethilon 1 was applied at about 1 to 1.5 cm from the border of the external uterine os; 6 to 8 stitches were placed so that the whole external uterine os was encircled (modified McDonald). In the next step, additional Ethilon 0 sutures were placed around all exposed parts of the main double circular suture and loosely tightened. On those sutures, a round tailored mesh (diameter around 6 cm; Prolene® or Gynemesh* PS) was attached. In all 9 cases, gestation was prolonged on average by 9 weeks and 4 days (67 days). In four cases maturity was achieved. The mesh was removed in the 37th–38th week of pregnancy or when spontaneous labour began. In two cases, a caesarean section was performed because of breech presentation. In the first week after a birth in the 22nd week, one newborn died because of immaturity (premature birth had been threatening since the 18th week, when the MC was placed). Ten years after the first MC, 8 of the 9 women with singleton pregnancies in whom MC was performed have 8 healthy children from these pregnancies. Mesh cerclage successfully closed the opened cervical canal or uterine orifice and prevented further membrane herniation and membrane rupture. MC also provides an effect similar to occluding the external os with suturing, but without blocking the excretion of abundant cervical mucus. The mesh also pulls the main circular band outwards and thus lowers the chance of the suture cutting through the remaining cervix. MC prolonged gestation very successfully (by a mean of 9 weeks and 4 days) and thus increased the possibility of survival and diminished the risk of complications in very early preterm delivered survivors in cases with cervical dilatation and membrane herniation before the 26th week of gestation. Without intervention, the possibility of reaching at least the 28th or 32nd week of gestation would have been poor. Keywords: cervical insufficiency, mesh cerclage, membrane protrusion, premature birth prevention, physical exam-indicated cerclage, rescue cerclage
Procedia PDF Downloads 191
333 Decoding Kinematic Characteristics of Finger Movement from Electrocorticography Using Classical Methods and Deep Convolutional Neural Networks
Authors: Ksenia Volkova, Artur Petrosyan, Ignatii Dubyshkin, Alexei Ossadtchi
Abstract:
Brain-computer interfaces are a growing research field with many implementations that find use in different areas, both for research and for practical purposes. Despite the popularity of implementations using non-invasive neuroimaging methods, radical improvement of the channel bandwidth and, thus, of decoding accuracy is only possible with invasive techniques. Electrocorticography (ECoG) is a minimally invasive neuroimaging method that provides highly informative brain activity signals, whose effective analysis requires machine learning methods that are able to learn representations of complex patterns. Deep learning is a family of machine learning algorithms that learn representations of data with multiple levels of abstraction. This study explores the potential of deep learning approaches for ECoG processing, decoding movement intentions and the perception of proprioceptive information. To obtain synchronous recordings of kinematic movement characteristics and the corresponding electrical brain activity, a series of experiments was carried out during which subjects performed finger movements at their own pace. Finger movements were recorded with a three-axis accelerometer, while ECoG was synchronously registered from electrode strips implanted over the contralateral sensorimotor cortex. The multichannel ECoG signals were then used to track the finger movement trajectory characterized by the accelerometer signal. This was carried out both causally and non-causally, using different positions of the ECoG data segment with respect to the accelerometer data stream. The recorded data were split into training and testing sets containing continuous non-overlapping fragments of the multichannel ECoG. A deep convolutional neural network was implemented and trained using 1-second segments of ECoG data from the training dataset as input. To assess the decoding accuracy, the correlation coefficient r between the output of the model and the accelerometer readings was computed. After optimization of hyperparameters and training, the deep learning model allowed reasonably accurate causal decoding of finger movement, with a correlation coefficient r = 0.8. In contrast, the classical Wiener-filter-like approach was able to achieve only 0.56 in the causal decoding mode. In the non-causal case, the traditional approach reached an accuracy of r = 0.69, which may be due to the presence of additional proprioceptive information. This result demonstrates that the deep neural network was able to effectively find a representation of the complex top-down information related to the actual movement rather than proprioception. The sensitivity analysis shows physiologically plausible pictures of the extent to which individual features (channel, wavelet subband) are utilized during the decoding procedure. In conclusion, the results of this study demonstrate that a combination of a minimally invasive neuroimaging technique such as ECoG and advanced machine learning approaches allows decoding movement with high accuracy. Such a setup provides means for control of devices with a large number of degrees of freedom as well as for exploratory studies of the complex neural processes underlying movement execution. Keywords: brain-computer interface, deep learning, ECoG, movement decoding, sensorimotor cortex
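For readers unfamiliar with this type of decoder, the following is a minimal sketch of a 1D convolutional network mapping 1-second multichannel windows to a kinematic target and evaluated with the correlation coefficient r; the channel count, window length, architecture, and synthetic data are assumptions and do not reproduce the authors' model.

```python
import torch
import torch.nn as nn

N_CH, FS = 32, 1000            # assumed: 32 ECoG channels at 1 kHz -> 1-second windows
WIN = FS

class ConvDecoder(nn.Module):
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv1d(N_CH, 16, kernel_size=64, stride=8), nn.ReLU(),
            nn.Conv1d(16, 16, kernel_size=16, stride=4), nn.ReLU(),
            nn.AdaptiveAvgPool1d(1), nn.Flatten(),
            nn.Linear(16, 1),                  # scalar acceleration target per window
        )

    def forward(self, x):                      # x: (batch, channels, samples)
        return self.net(x).squeeze(-1)

def pearson_r(a, b):
    """Correlation coefficient r between predictions and accelerometer target."""
    a, b = a - a.mean(), b - b.mean()
    return (a * b).sum() / (a.norm() * b.norm() + 1e-8)

# Synthetic stand-in data: the target is a noisy readout of one channel's power
X = torch.randn(128, N_CH, WIN)
y = X[:, 0].pow(2).mean(dim=-1) + 0.1 * torch.randn(128)

model = ConvDecoder()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
for epoch in range(15):
    opt.zero_grad()
    loss = nn.functional.mse_loss(model(X), y)
    loss.backward()
    opt.step()

with torch.no_grad():
    print(f"train correlation r = {pearson_r(model(X), y).item():.2f}")
```

In a causal setting, each window would end at or before the time point whose acceleration is being predicted; the non-causal setting allows the window to extend past it, which is where additional proprioceptive information can enter.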
Procedia PDF Downloads 184