Search results for: mixed gaussian processes
7032 Towards an Environmental Knowledge System in Water Management
Authors: Mareike Dornhoefer, Madjid Fathi
Abstract:
Water supply and water quality are key problems of mankind at the moment and - due to increasing population - in the future. Management disciplines like water, environment and quality management therefore need to closely interact, to establish a high level of water quality and to guarantee water supply in all parts of the world. Groundwater remediation is one aspect in this process. From a knowledge management perspective it is only possible to solve complex ecological or environmental problems if different factors, expert knowledge of various stakeholders and formal regulations regarding water, waste or chemical management are interconnected in the form of a knowledge base. In general, knowledge management focuses on the processes of gathering and representing existing and new knowledge in a way which allows for inference or deduction of knowledge, e.g. for a situation where a problem solution or decision support is required. A knowledge base is not merely a data repository, but a key element in a knowledge-based system, providing or allowing for inference mechanisms to deduce further knowledge from existing facts. In consequence, this knowledge provides decision support. The given paper introduces an environmental knowledge system in water management. The proposed environmental knowledge system is part of a research concept called Green Knowledge Management. It applies semantic technologies or concepts such as ontologies or linked open data to interconnect different data and information sources about environmental aspects, in this case water quality, as well as background material enriching an established knowledge base. Examples of the aforementioned ecological or environmental factors threatening water quality are, among others, industrial pollution (e.g. leakage of chemicals), environmental changes (e.g. rise in temperature) or floods, where all kinds of waste are merged and transferred into natural water environments. Water quality is usually determined by measuring different indicators (e.g. chemical or biological), which are gathered with the help of laboratory testing, continuous monitoring equipment or other measuring processes. During all of these processes data are gathered and stored in different databases. The knowledge base then needs to be established by interconnecting data from these different data sources and enriching their semantics. Experts may add their knowledge or experiences of previous incidents or influencing factors. In consequence, querying or inference mechanisms are applied to deduce coherence between indicators, predictive developments or environmental threats. Relevant processes or steps of action may be modeled in the form of a rule-based approach. Overall, the environmental knowledge system supports the interconnection of information and the addition of semantics to create environmental knowledge about the water environment, supply chain and quality. The proposed concept itself is a holistic approach, which links to associated disciplines like environmental and quality management. Quality indicators and quality management steps need to be considered, e.g. for the process and inference layers of the environmental knowledge system, thus integrating the aforementioned management disciplines in one water management application.
Keywords: water quality, environmental knowledge system, green knowledge management, semantic technologies, quality management
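As an illustration of the semantic-technology layer described above, the following minimal Python sketch stores two hypothetical water-quality measurements as linked-data triples with rdflib and applies a simple rule-like SPARQL query to flag samples exceeding an assumed nitrate threshold; the namespace, property names, instance data and threshold are invented for illustration and are not taken from the paper's knowledge base.

```python
from rdflib import Graph, Literal, Namespace, RDF, XSD

# Hypothetical namespace and instance data, purely illustrative.
EX = Namespace("http://example.org/water#")

g = Graph()
g.bind("ex", EX)

# Measurements of a chemical indicator at two monitoring stations.
g.add((EX.sample42, RDF.type, EX.WaterSample))
g.add((EX.sample42, EX.station, EX.stationA))
g.add((EX.sample42, EX.nitrateMgPerL, Literal(62.0, datatype=XSD.double)))

g.add((EX.sample43, RDF.type, EX.WaterSample))
g.add((EX.sample43, EX.station, EX.stationB))
g.add((EX.sample43, EX.nitrateMgPerL, Literal(12.5, datatype=XSD.double)))

# A simple rule-like query: flag samples whose nitrate level exceeds
# an assumed threshold of 50 mg/L.
query = """
PREFIX ex: <http://example.org/water#>
SELECT ?sample ?station ?value WHERE {
    ?sample a ex:WaterSample ;
            ex:station ?station ;
            ex:nitrateMgPerL ?value .
    FILTER (?value > 50.0)
}
"""
for row in g.query(query):
    print(f"Threshold exceeded: {row.sample} at {row.station} ({row.value} mg/L)")
```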
Procedia PDF Downloads 222
7031 Cognitive Behavioral Modification in the Treatment of Aggressive Behavior in Children
Authors: Dijana Sulejmanović
Abstract:
Cognitive-behavioral modification (CBM) is a combination of cognitive and behavioral learning principles used to shape and encourage desired behaviors. A crucial element of cognitive-behavioral modification is that a change in behavior precedes awareness of how it affects others. CBM is oriented toward changing inner speech and learning to control behaviors through self-regulation techniques. It aims to teach individuals how to develop the ability to recognize, monitor and modify their thoughts, feelings, and behaviors. The review of the literature emphasizes the efficiency of the CBM approach in the treatment of children's hyperactivity and negative emotions such as anger. The results of earlier research show how impulsive and hyperactive behavior, agitation, and aggression may slow down and block the child from being able to actively monitor and participate in regular classes, resulting in the disruption of the classroom and the teaching process; the children may feel rejected and isolated and develop a long-term poor image of themselves and others. In this article, we show how the use of CBM, adapted to the child's age, can incorporate measures of cognitive and emotional functioning which can help us to better understand children’s cognitive processes, their cognitive strengths and weaknesses, and to identify factors that may influence their behavioral and emotional regulation. Such a comprehensive evaluation can also help identify cognitive and emotional risk factors associated with aggressive behavior, specifically the processes involved in modulating and regulating cognition and emotions.
Keywords: aggressive behavior, cognitive behavioral modification, cognitive behavioral theory, modification
Procedia PDF Downloads 329
7030 A Soft Computing Approach Monitoring of Heavy Metals in Soil and Vegetables in the Republic of Macedonia
Authors: Vesna Karapetkovska Hristova, M. Ayaz Ahmad, Julijana Tomovska, Biljana Bogdanova Popov, Blagojce Najdovski
Abstract:
The average total concentrations of heavy metals (cadmium [Cd], copper [Cu], nickel [Ni], lead [Pb], and zinc [Zn]) were analyzed in soil and vegetable samples collected from different regions of Macedonia during the years 2010-2012. Basic soil properties such as pH, organic matter and clay content were also included in the study. The average concentrations of Cd, Cu, Ni, Pb, Zn in the A horizon (0-30 cm) of agricultural soils were as follows, respectively: 0.25, 5.3, 6.9, 15.2, 26.3 mg kg-1 of soil. We have found that a neural network model can be considered as a tool for prediction and spatial analysis of the processes controlling the metal transfer within the soil and vegetables. The predictive ability of such models is well over 80%, as compared to 20% for typical regression models. A radial basis function network shows good prediction accuracy and reflects the correlation coefficients between soil properties and metal content in vegetables much better than the back-propagation method. Neural networks / soft computing can support the decision-making processes at different levels, including agro-ecology, to improve crop management based on monitoring data and risk assessment of metal transfer from soils to vegetables.
Keywords: soft computing approach, total concentrations, heavy metals, agricultural soils
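To make the soft-computing idea concrete, the following sketch fits an RBF-kernel regressor (a stand-in for the radial basis function network described above) that predicts a metal content in vegetables from soil properties; the data are synthetic and the assumed transfer relationship is purely illustrative, not the study's measurements.

```python
import numpy as np
from sklearn.kernel_ridge import KernelRidge
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)

# Synthetic stand-in data: soil pH, organic matter (%), clay (%), soil Cd (mg/kg).
n = 200
X = np.column_stack([
    rng.uniform(5.5, 8.0, n),    # pH
    rng.uniform(1.0, 6.0, n),    # organic matter
    rng.uniform(10, 40, n),      # clay content
    rng.uniform(0.05, 0.5, n),   # soil Cd
])
# Assumed soil-to-vegetable transfer relationship plus noise (illustrative only).
y = 0.8 * X[:, 3] * (8.5 - X[:, 0]) / (1 + 0.1 * X[:, 2]) + rng.normal(0, 0.02, n)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
scaler = StandardScaler().fit(X_tr)

# RBF-kernel model used here in place of the radial basis function network.
model = KernelRidge(kernel="rbf", alpha=0.1, gamma=0.5)
model.fit(scaler.transform(X_tr), y_tr)
print("R^2 on held-out samples:", model.score(scaler.transform(X_te), y_te))
```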
Procedia PDF Downloads 368
7029 Influence of Engaging Female Caregivers in Households with Adolescent Girls on Adopting Equitable Family Eating Practices: A Quasi-Experimental Study
Authors: Hanna Gulema, Meaza Demissie, Alemayehu Worku, Tesfaye Assebe Yadeta, Yemane Berhane
Abstract:
Background: In patriarchal societies, female caregivers decide on food allocation within a family based on prevailing gender and age norms, which may lead to inequality that does not favor young adolescent girls. This study evaluated the effect of a community-based social norm intervention involving female caregivers in West Hararghe, Ethiopia. The intervention engaged female caregivers along with other adult influential community members to deliberate and act on food allocation social norms in a process referred to as Social Analysis and Action (SAA). Method: We used data from a large quasi-experimental study to compare family eating practices between those who participated in the Social Analysis and Action intervention and those who did not. The respondents were female caregivers in households with young adolescent girls (ages 13 and 14 years). The study’s outcome was the practice of the family eating together from the same dish. Difference-in-differences (DID) analysis with a mixed-effect logistic regression model was used to examine the effect of the intervention. Result: The results showed improved family eating practices in both groups, but the improvement was greater in the intervention group. The DID analysis showed an 11.99 percentage point greater improvement in the intervention arm than in the control arm. The mixed-effect regression produced an adjusted odds ratio of 2.08 (95% CI [1.06–4.09]) after controlling for selected covariates (p-value 0.033). Conclusions: In our study, the involvement of influential adult community members significantly improved the family practice of eating together in households where adolescent girls are present. The intervention has great potential to minimize household food allocation inequalities and thus improve the nutritional status of young adolescents. Further studies are necessary to evaluate the effectiveness of the intervention in different social norm contexts to formulate policy and guidelines for scale-up.
Keywords: family eating practice, social norm intervention, adolescent girls, caregiver
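The core difference-in-differences calculation can be illustrated in a few lines of Python; the group proportions below are invented placeholders, and the sketch does not reproduce the study's mixed-effect logistic regression or covariate adjustment.

```python
# Hypothetical proportions of households reporting eating together from the
# same dish (illustrative values, not the study's data).
control_pre, control_post = 0.40, 0.52
intervention_pre, intervention_post = 0.38, 0.62

# DID = change in the intervention arm minus change in the control arm.
did = (intervention_post - intervention_pre) - (control_post - control_pre)
print(f"Difference-in-differences: {did * 100:.2f} percentage points")
```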
Procedia PDF Downloads 75
7028 Unveiling the Impact of Ultra High Vacuum Annealing Levels on Physico-Chemical Properties of Bulk ZnSe Semiconductor
Authors: Kheira Hamaida, Mohamed Salah Halati
Abstract:
In this paper, our aim is to link, as far as possible, the obtained simulation results with the experimental ones, focusing on the electronic and optical properties of ZnSe. The spectra of the total and partial densities of states are predicted using the Full-Potential Linearized Augmented Plane Wave method with the Tran-Blaha (TB) modified Becke-Johnson (mBJ) exchange-correlation potential (EXC). The upper valence energy (UVE) levels contain the relative contributions of the Se-(4p and 3d) states, with a considerable contribution from the electrons of the Zn-2s orbital. The dielectric function of w-ZnSe, with its two parts, shows a noticeable anisotropic character. The microscopic origins of the electronic states responsible for the observed peaks in the spectrum are determined through the decomposition of the spectrum into the individual contributions of the electronic transitions between pairs of bands (Vi, Ci), where Vi is an occupied state in the valence band and Ci is an unoccupied state in the conduction band. X-ray photoelectron spectroscopy (XPS) is an important technique used to probe the homogeneity, stoichiometry, and purity of the title compound. In order to check the electron transitions derived from the simulations, the Reflected Electron Energy Loss Spectroscopy (REELS) technique, which is of great sensitivity, is used to determine the interband electronic transitions. Within the optical window (Eg), all the created electron energy states were also determined through Gaussian deconvolution of the photoluminescence spectrum (PLS) probed at room temperature (RT).
Keywords: spectroscopy, WIEN2K, IIB-VIA semiconductors, dielectric function
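The Gaussian deconvolution step mentioned above can be sketched with scipy's curve_fit on a synthetic photoluminescence-like spectrum; the peak positions, widths and noise level are assumptions for illustration, not the measured ZnSe data.

```python
import numpy as np
from scipy.optimize import curve_fit

def gaussian(x, amp, center, width):
    return amp * np.exp(-((x - center) ** 2) / (2 * width ** 2))

def two_gaussians(x, a1, c1, w1, a2, c2, w2):
    return gaussian(x, a1, c1, w1) + gaussian(x, a2, c2, w2)

# Synthetic PL-like spectrum (energies in eV); peak values are assumptions.
energy = np.linspace(2.2, 3.0, 400)
spectrum = (gaussian(energy, 1.0, 2.70, 0.03)
            + gaussian(energy, 0.4, 2.55, 0.06)
            + np.random.default_rng(1).normal(0, 0.01, energy.size))

p0 = [1.0, 2.7, 0.05, 0.5, 2.5, 0.05]          # initial guesses
popt, _ = curve_fit(two_gaussians, energy, spectrum, p0=p0)
for i in range(2):
    amp, center, width = popt[3 * i:3 * i + 3]
    print(f"Peak {i + 1}: center = {center:.3f} eV, FWHM = {2.355 * abs(width):.3f} eV")
```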
Procedia PDF Downloads 67
7027 Studying Educational Processes through a Multifocal Viewpoint: Educational and Social Studies
Authors: Noa Shriki, Atara Shriki
Abstract:
Lifelong learning is considered essential for teachers' professional development, which in turn has implications for the improvement of the entire education system. In recent years, many programs designed to support teachers' professional development have been criticized for not achieving their goal. A variety of reasons have been proposed to explain the causes of the ineffectiveness of such programs. In this study, we put to the test the possibility that teachers do not change as a result of their participation in professional programs due to a gap between the contents and approaches included in them and teachers' beliefs about teaching and learning. Eighteen elementary school mathematics teachers participated in the study. These teachers were involved in collaborating with their students in inquiring into mathematical ideas while implementing action research. Employing educational theories, the results indicated that this experience had a positive effect on the teachers' professional development. In particular, there was an evident change in their beliefs regarding their role as mathematics teachers. However, when employing a different perspective for analyzing the data, the lens of Kurt Lewin's theory of re-education, we realized that this change of beliefs must be questioned. Therefore, it is suggested that the analysis of educational processes should be carried out not only through common educational theories, but also on the basis of social and organizational theories. It is assumed that both the field of education and the fields of social studies and organizational consulting will benefit from this multifocal viewpoint.
Keywords: educational theories, professional development, re-education, teachers' beliefs
Procedia PDF Downloads 142
7026 Phenomena-Based Approach for Automated Generation of Process Options and Process Models
Authors: Parminder Kaur Heer, Alexei Lapkin
Abstract:
Due to global challenges of increased competition and demand for more sustainable products/processes, there is rising pressure on the industry to develop innovative processes. Through Process Intensification (PI) the existing and new processes may be able to attain higher efficiency. However, very few PI options are generally considered. This is because processes are typically analysed at a unit operation level, thus limiting the search space for potential process options. PI performed at more detailed levels of a process can increase the size of the search space. PI can be achieved at the unit operation, functional and phenomena levels. Physical/chemical phenomena form the lowest level of aggregation and thus are expected to give the highest impact, because all the intensification options can be described by their enhancement. The objective of the current work is thus the generation of numerous process alternatives based on phenomena, and the development of their corresponding computer-aided models. The methodology comprises: a) automated generation of process options, and b) automated generation of process models. The process under investigation is disintegrated into functions, viz. reaction, separation etc., and these functions are further broken down into the phenomena required to perform them. E.g., separation may be performed via vapour-liquid or liquid-liquid equilibrium. A list of phenomena for the process is formed and new phenomena, which can overcome the difficulties/drawbacks of the current process or can enhance the effectiveness of the process, are added to the list. For instance, the catalyst separation issue can be handled by using solid catalysts; the corresponding phenomena are identified and added. The phenomena are then combined to generate all possible combinations. However, not all combinations make sense and, hence, screening is carried out to discard the combinations that are meaningless. For example, phase change phenomena need the co-presence of energy transfer phenomena. Feasible combinations of phenomena are then assigned to the functions they execute. A combination may accomplish a single or multiple functions, i.e. it might perform reaction or reaction with separation. The combinations are then allotted to the functions needed for the process. This creates a series of options for carrying out each function. Combination of these options for different functions in the process leads to the generation of a superstructure of process options. These process options, which are formed by a list of phenomena for each function, are passed to the model generation algorithm in the form of binaries (1, 0). The algorithm gathers the active phenomena and couples them to generate the model. A series of models is generated for the functions, which are combined to get the process model. The most promising process options are then chosen subject to a performance criterion, for example purity of product, or via a multi-objective Pareto optimisation. The methodology was applied to a two-step process and the best route was determined based on the higher product yield. The current methodology can identify, produce and evaluate process intensification options from which the optimal process can be determined. It can be applied to any chemical/biochemical process because of its generic nature.
Keywords: phenomena, process intensification, process models, process options
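A toy version of the combination-and-screening step might look like the following; the phenomenon names and the second screening rule are illustrative stand-ins, not the paper's actual knowledge base, while the first rule follows the phase-change example given in the abstract.

```python
from itertools import combinations

# Hypothetical phenomena list (names are illustrative).
phenomena = ["reaction", "vapour_liquid_equilibrium", "liquid_liquid_equilibrium",
             "phase_change", "energy_transfer", "mixing"]

def feasible(combo):
    combo = set(combo)
    # Rule from the text: phase change needs co-present energy transfer.
    if "phase_change" in combo and "energy_transfer" not in combo:
        return False
    # Assumed rule: the two equilibrium phenomena are treated as alternatives.
    if {"vapour_liquid_equilibrium", "liquid_liquid_equilibrium"} <= combo:
        return False
    return True

# Generate all 2- and 3-phenomenon combinations, then screen out infeasible ones.
options = [c for r in range(2, 4)
           for c in combinations(phenomena, r) if feasible(c)]

for opt in options[:5]:
    print(opt)
print(f"{len(options)} feasible combinations generated")
```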
Procedia PDF Downloads 234
7025 Making the Neighbourhood: Analyzing Mapping Procedures to Deal with Plurality and Conflict
Authors: Barbara Roosen, Oswald Devisch
Abstract:
Spatial projects are often contested. Despite participatory trajectories in official spatial development processes, citizens engage often by their power to say no. Participatory mapping helps to produce more legible and democratic ways of decision-making. It has proven its value in producing a multitude of knowledges and views, for individuals and community groups and local stakeholders to imagine desired and undesired futures and to give them the rhetorical power to present their views throughout the development process. From this perspective, mapping works as a social process in which individuals and groups share their knowledge, learn from each other and negotiate their relationship with each other as well as with space and power. In this way, these processes eventually aim to activate communities to intervene in cooperation in real problems. However, these are fragile and bumpy processes, sometimes leading to (local) conflict and intractable situations. Heterogeneous subjectivities and knowledge that become visible during the mapping process and which are contested by members of the community, is often the first trigger. This paper discusses a participatory mapping project conducted in a residential subdivision in Flanders to provide a deeper understanding of how or under which conditions the mapping process could moderate discordant situations amongst inhabitants, local organisations and local authorities, towards a more constructive outcome. In our opinion, this implies a thorough documentation and presentation of the different steps of the mapping process to design and moderate an open and transparent dialogue. The mapping project ‘Make the Neighbourhood’, is set up in the aftermath of a socio-spatial design intervention in the neighbourhood that led to polarization within the community. To start negotiation between the diverse claims that came to the fore, we co-create a desired future map of the neighbourhood together with local organisations and inhabitants as a way to engage them in the development of a new spatial development plan for the area. This mapping initiative set up a new ‘common’ goal or concern, as a first step to bridge the gap that we experienced between different sociocultural groups, bottom-up and top-down initiatives and between professionals and non-professionals. An atlas of elements (materials), an atlas of actors with different roles and an atlas of ways of cooperation and organisation form the work and building material of the future neighbourhood map, assembled in two co-creation sessions. Firstly, we will consider how the mapping procedures articulate the plurality of claims and agendas. Secondly, we will elaborate upon how social relations and spatialities are negotiated and reproduced during the different steps of the map making. Thirdly, we will reflect on the role of the rules, format, and structure of the mapping process in moderating negotiations between much divided claims. To conclude, we will discuss the challenges of visualizing the different steps of mapping process as a strategy to moderate tense negotiations in a more constructive direction in the context of spatial development processes.Keywords: conflict, documentation, participatory mapping, residential subdivision
Procedia PDF Downloads 211
7024 CT Medical Images Denoising Based on New Wavelet Thresholding Compared with Curvelet and Contourlet
Authors: Amir Moslemi, Amir movafeghi, Shahab Moradi
Abstract:
One of the most challenging factors in medical images is noise. Image denoising refers to the improvement of a digital medical image that has been corrupted by Additive White Gaussian Noise (AWGN). A digital medical image or video can be affected by different types of noise: impulse noise, Poisson noise and AWGN. Computed tomography (CT) images suffer from low quality due to noise. The quality of CT images depends directly on the absorbed dose to patients (ADP), in such a way that an increase in absorbed radiation enhances CT image quality. Consequently, noise reduction techniques that enhance image quality without exposing patients to excess radiation are one of the challenging problems of CT image processing. In this work, noise reduction in CT images was performed using two different directional two-dimensional (2D) transformations, i.e. Curvelet and Contourlet, and the Discrete Wavelet Transform (DWT) thresholding methods BayesShrink and AdaptShrink, compared to each other. We propose a new threshold in the wavelet domain for not only noise reduction but also edge retention; consequently, the proposed method retains the significant modified coefficients, which results in good visual quality. Data evaluations were accomplished using two criteria, namely peak signal-to-noise ratio (PSNR) and structural similarity (SSIM).
Keywords: computed tomography (CT), noise reduction, curvelet, contourlet, peak signal-to-noise ratio (PSNR), structural similarity (SSIM), absorbed dose to patient (ADP)
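A minimal sketch of subband soft thresholding with a BayesShrink-style threshold and PSNR evaluation is shown below, using PyWavelets and scikit-image; it uses a generic test image in place of a CT slice and does not implement the authors' proposed edge-retaining threshold.

```python
import numpy as np
import pywt
from skimage import data, img_as_float
from skimage.metrics import peak_signal_noise_ratio

# Generic grayscale test image standing in for a CT slice (assumption).
clean = img_as_float(data.camera())
noisy = clean + np.random.default_rng(0).normal(0, 0.05, clean.shape)

coeffs = pywt.wavedec2(noisy, "db4", level=3)
# Noise std estimated from the finest diagonal subband (robust median estimator).
sigma_n = np.median(np.abs(coeffs[-1][-1])) / 0.6745

denoised_coeffs = [coeffs[0]]
for detail_level in coeffs[1:]:
    new_level = []
    for band in detail_level:
        sigma_x = np.sqrt(max(band.var() - sigma_n ** 2, 1e-12))
        thr = sigma_n ** 2 / sigma_x                   # BayesShrink-style threshold
        new_level.append(pywt.threshold(band, thr, mode="soft"))
    denoised_coeffs.append(tuple(new_level))

denoised = pywt.waverec2(denoised_coeffs, "db4")[: clean.shape[0], : clean.shape[1]]
print("PSNR noisy   :", peak_signal_noise_ratio(clean, noisy, data_range=1.0))
print("PSNR denoised:", peak_signal_noise_ratio(clean, np.clip(denoised, 0, 1), data_range=1.0))
```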
Procedia PDF Downloads 443
7023 Tuberculosis in Humans and Animals in the Eastern Part of the Sudan
Authors: Yassir Adam Shuaib, Stefan Niemann, Eltahir Awad Khalil, Ulrich Schaible, Lothar Heinz Wieler, Mohammed Ahmed Bakhiet, Abbashar Osman Mohammed, Mohamed Abdelsalam Abdalla, Elvira Richter
Abstract:
Tuberculosis (TB) is a chronic bacterial disease of humans and animals, characterized by the progressive development of specific granulomatous tubercle lesions in affected tissues. In a six-month study, from June to November 2014, a total of 2,304 carcasses of cattle, camels, sheep, and goats slaughtered at the East and West Gaash slaughterhouses, Kassala, were investigated during postmortem inspection; in parallel, 101 sputum samples from TB-suspected patients at Kassala and El-Gadarif Teaching Hospitals were collected in order to investigate tuberculosis in animals and humans. Only 0.1% of carcasses were found with suspected TB lesions, in the liver, lung and peritoneal cavity of two sheep, and no tuberculous lesions were found in the carcasses of cattle, goats or camels. All samples, tissue lesions and sputum, were decontaminated by the NALC-NaOH method and cultured for mycobacterial growth at the NRZ for Mycobacteria, Research Center Borstel, Germany. Genotyping and molecular characterization of the grown strains were done by line probe assay (GenoType CM and MTBC) and 16S rDNA, rpoB gene, and ITS sequencing, spoligotyping, MIRU-VNTR typing and next generation sequencing (NGS). Culture of the specimens revealed growth of organisms from 81.6% of all samples. Mycobacterium tuberculosis (76.2%), M. intracellulare (14.2%), mixed infection with M. tuberculosis and M. intracellulare (6.0%) and mixed infection with M. tuberculosis and M. fortuitum and with M. intracellulare and unknown species (1.2%) were detected in the sputum samples, and unknown species (1.2%) were detected in the tissue samples of one of the animals. Of the 69 M. tuberculosis strains, 25 (36.2%) were either mono-drug-resistant, multi-drug-resistant or poly-drug-resistant, but none was extensively drug-resistant. In conclusion, the prevalence of TB in animals was very low, while in humans the M. tuberculosis Delhi/CAS lineage was responsible for most cases and there was evidence of MDR transmission and acquisition.
Keywords: animal, human, slaughterhouse, Sudan, tuberculosis
Procedia PDF Downloads 371
7022 A Systemic Maturity Model
Authors: Emir H. Pernet, Jeimy J. Cano
Abstract:
Maturity models, used descriptively to explain changes in reality or normatively to guide managers in making interventions that render organizations more effective and efficient, are based on the principles of statistical quality control promulgated by Shewhart in the 1930s, and on the principles of PDCA continuous improvement (Plan, Do, Check, Act) developed by Deming and Juran. Some frameworks developed over the concept of maturity models include COBIT, CMM, and ITIL. This paper presents some limitations of traditional maturity models, most of them based on points of reflection and analysis made by several authors. Almost all limitations are related to the mechanistic and reductionist approach of the principles upon which those models are built. As Systems Theory helps the understanding of the dynamics of organizations and organizational change, the development of a systemic maturity model can help to overcome some of those limitations. This document proposes a systemic maturity model, based on a systemic conceptualization of organizations, focused on the study of the functioning of the parts, the relationships among them, and their behavior as a whole. The concept of maturity, from the systems theory perspective, is conceptually defined as an emergent property of the organization, which arises as a result of the degree of alignment and integration of its processes. This concept is operationalized through a systemic function that measures the maturity of an organization, and finally validated by measuring maturity in organizations. For its operationalization and validation, the model was applied to measure the maturity of organizational Governance, Risk and Compliance (GRC) processes.
Keywords: GRC, maturity model, systems theory, viable system model
Procedia PDF Downloads 313
7021 Local Interpretable Model-agnostic Explanations (LIME) Approach to Email Spam Detection
Authors: Rohini Hariharan, Yazhini R., Blessy Maria Mathew
Abstract:
The task of detecting email spam is a very important one in the era of digital technology that needs effective ways of curbing unwanted messages. This paper presents an approach aimed at making email spam categorization algorithms transparent, reliable and more trustworthy by incorporating Local Interpretable Model-agnostic Explanations (LIME). Our technique assists in providing interpretable explanations for specific classifications of emails to help users understand the decision-making process by the model. In this study, we developed a complete pipeline that incorporates LIME into the spam classification framework and allows creating simplified, interpretable models tailored to individual emails. LIME identifies influential terms, pointing out key elements that drive classification results, thus reducing opacity inherent in conventional machine learning models. Additionally, we suggest a visualization scheme for displaying keywords that will improve understanding of categorization decisions by users. We test our method on a diverse email dataset and compare its performance with various baseline models, such as Gaussian Naive Bayes, Multinomial Naive Bayes, Bernoulli Naive Bayes, Support Vector Classifier, K-Nearest Neighbors, Decision Tree, and Logistic Regression. Our testing results show that our model surpasses all other models, achieving an accuracy of 96.59% and a precision of 99.12%.Keywords: text classification, LIME (local interpretable model-agnostic explanations), stemming, tokenization, logistic regression.
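A minimal version of the pipeline described above can be sketched with scikit-learn and the lime package; the toy corpus and the TF-IDF plus logistic regression classifier are placeholders standing in for the study's dataset and models.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline
from lime.lime_text import LimeTextExplainer

# Toy training corpus (placeholder for a real labelled email dataset).
texts = ["win a free prize now", "claim your free lottery reward",
         "meeting agenda for tomorrow", "please review the attached report",
         "free cash offer click now", "lunch with the project team"]
labels = [1, 1, 0, 0, 1, 0]                      # 1 = spam, 0 = ham

pipeline = make_pipeline(TfidfVectorizer(), LogisticRegression())
pipeline.fit(texts, labels)

explainer = LimeTextExplainer(class_names=["ham", "spam"])
email = "click now to claim your free prize"
explanation = explainer.explain_instance(email, pipeline.predict_proba, num_features=5)

# Influential terms and their weights towards the predicted class.
for term, weight in explanation.as_list():
    print(f"{term:>10s}  {weight:+.3f}")
```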
Procedia PDF Downloads 49
7020 Interior Architecture in the Anthropocene: Engaging the Subnature through the Intensification of Body-Surface Interaction
Authors: Verarisa Ujung
Abstract:
The Anthropocene – as scientists define as a new geological epoch where human intervention has the dominant influence on the geological, atmospheric, and ecological processes challenges the contemporary discourse in architecture and interior. The dominant influence characterises the incapability to distinguish the notion of nature, subnature, human and non-human. Consequently, living in the Anthropocene demands sensitivity and responsiveness to heighten our sense of the rhythm of transformation and recognition of our environment as a product of natural, social and historical processes. The notion of subnature is particularly emphasised in this paper to investigate the poetic sense of living with subnature. It could be associated with the critical tool for exploring the aesthetic and programmatic implications of subnature on interiority. The ephemeral immaterial attached to subnature promotes the sense of atmospheric delineation of interiority, the very inner significance of body-surface interaction, which central to interior architecture discourse. This would then reflect human’s activities; examine the transformative change, the architectural motion and the traces that left between moments. In this way, engaging the notion of subnature enable us to better understand the critical subject on interiority and might provide an in-depth study on interior architecture. Incorporating the exploration on the form, materiality, and pattern of subnature, this research seeks to grasp the inner significance of micro to macro approaches so that the future of interior might be compelled to depend more on the investigation and development of responsive environment. To reflect upon the form, materiality and intensity of subnature that specifically characterized by the natural, social and historical processes, this research examines a volcanic land, White Island/Whakaari, New Zealand as the chosen site of investigation. Emitting various forms and intensities of subnatures - smokes, mud, sulphur gas, this volcanic land also open to the new inhabitation within the sulphur factory ruins that reflects human’s past occupation. In this way, temporal and natural selected manifestations of materiality, artefact, and performance can be traced out and might reveal the meaningful relations among space, inhabitation, and well-being of inhabitants in the Anthropocene.Keywords: anthropocene, body, intensification, intensity, interior architecture, subnature, surface
Procedia PDF Downloads 177
7019 A Review on Factors Influencing Implementation of Secure Software Development Practices
Authors: Sri Lakshmi Kanniah, Mohd Naz’ri Mahrin
Abstract:
More and more businesses and services are depending on software to run their daily operations and business services. At the same time, cyber-attacks are becoming more covert and sophisticated, posing threats to software. Vulnerabilities exist in the software due to the lack of security practices during the phases of software development. Implementation of secure software development practices can improve the resistance to attacks. Many methods, models and standards for secure software development have been developed. However, despite the efforts, they still come up against difficulties in their deployment and the processes are not institutionalized. There is a set of factors that influence the successful deployment of secure software development processes. In this study, the methodology and results from a systematic literature review of factors influencing the implementation of secure software development practices is described. A total of 44 primary studies were analysed as a result of the systematic review. As a result of the study, a list of twenty factors has been identified. Some of factors that affect implementation of secure software development practices are: Involvement of the security expert, integration between security and development team, developer’s skill and expertise, development time and communication between stakeholders. The factors were further classified into four categories which are institutional context, people and action, project content and system development process. The results obtained show that it is important to take into account organizational, technical and people issues in order to implement secure software development initiatives.Keywords: secure software development, software development, software security, systematic literature review
Procedia PDF Downloads 381
7018 Radical Scavenging Activity of Protein Extracts from Pulse and Oleaginous Seeds
Authors: Silvia Gastaldello, Maria Grillo, Luca Tassoni, Claudio Maran, Stefano Balbo
Abstract:
Antioxidants are nowadays attractive not only for their countless benefits to human and animal health, but also for the prospect of being used as food preservatives instead of synthetic chemical molecules. In this study, the radical scavenging activity of six protein extracts from pulse and oleaginous seeds was evaluated. The selected matrices are Pisum sativum (yellow pea from two different origins), Carthamus tinctorius (safflower), Helianthus annuus (sunflower), Lupinus luteus cv Mister (lupin) and Glycine max (soybean), since they are economically interesting for both human and animal nutrition. The seeds were ground and proteins extracted from 20 mg of powder with a specific vegetal-extraction kit. Proteins were quantified through the Bradford protocol, and scavenging activity was revealed using the DPPH assay, based on the absorbance decrease of the DPPH radical (2,2-diphenyl-1-picrylhydrazyl) in the presence of antioxidant molecules. Different concentrations of the protein extract (1, 5, 10, 50, 100, 500 µg/ml) were mixed with DPPH solution (DPPH 0.004% in ethanol 70% v/v). Ascorbic acid was used as a scavenging activity standard reference, at the same six concentrations as the protein extracts, while DPPH solution was used as control. Samples and standard were prepared in triplicate and incubated for 30 minutes in the dark at room temperature, and the absorbance was read at 517 nm (ABS30). The average and standard deviation of absorbance values were calculated for each concentration of samples and standard. Statistical analysis using Student's t-test and p-values was performed to assess the statistical significance of the scavenging activity difference between the samples (or standard) and control (ABSctrl). The percentage of antioxidant activity was calculated using the formula [(ABSctrl-ABS30)/ABSctrl]*100. The obtained results demonstrate that all matrices showed antioxidant activity. Ascorbic acid, used as standard, exhibits 96% scavenging activity at the concentration of 500 µg/ml. Under the same conditions, sunflower, safflower and yellow peas revealed the highest antioxidant performance among the matrices analyzed, with activities of 74%, 68% and 70%, respectively (p < 0.005). Although lupin and soybean exhibit a lower antioxidant activity compared to the other matrices, they showed percentages of 46% and 36%, respectively. All these data suggest the possibility of using undervalued edible matrices as antioxidant sources. However, further studies are necessary to investigate a possible synergic effect of several matrices as well as the impact of industrial processes for a large-scale approach.
Keywords: antioxidants, DPPH assay, natural matrices, vegetal proteins
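A worked example of the scavenging-activity formula, with invented triplicate absorbance readings, is given below.

```python
import statistics

# Hypothetical triplicate absorbance readings at 517 nm (illustrative values).
abs_control = [0.820, 0.815, 0.825]   # DPPH solution only (ABSctrl)
abs_sample = [0.240, 0.255, 0.248]    # DPPH + protein extract after 30 min (ABS30)

abs_ctrl = statistics.mean(abs_control)
abs_30 = statistics.mean(abs_sample)

# [(ABSctrl - ABS30) / ABSctrl] * 100
scavenging_pct = (abs_ctrl - abs_30) / abs_ctrl * 100
print(f"Radical scavenging activity: {scavenging_pct:.1f}%")
```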
Procedia PDF Downloads 434
7017 Mixed-Methods Analyses of Subjective Strategies of Most Unlikely but Successful Transitions from Social Benefits to Work
Authors: Hirseland Andreas, Kerschbaumer Lukas
Abstract:
In the case of Germany, there are about one million long-term unemployed – a figure that did not vary much during the past years. These long-term unemployed did not benefit from the prospering labor market while most short-term unemployed did. Instead, they are continuously dependent on welfare and sometimes precarious short-term employment, experiencing work poverty. Long-term unemployment thus turns into a main obstacle to become employed again, especially if it is accompanied by other impediments such as low-level education (school/vocational), poor health (especially chronical illness), advanced age (older than fifty), immigrant status, motherhood or engagement in care for other relatives. As can be shown by this current research project, in these cases the chance to regain employment decreases to near nil. Almost two-thirds of all welfare recipients have multiple impediments which hinder a successful transition from welfare back to sustainable and sufficient employment. Prospective employers are unlikely to hire long-term unemployed with additional impediments because they evaluate potential employees on their negative signaling (e.g. low-level education) and the implicit assumption of unproductiveness (e.g. poor health, age). Some findings of the panel survey “Labor market and social security” (PASS) carried out by the Institute of Employment Research (the research institute of the German Federal Labor Agency) spread a ray of hope, showing that unlikely does not necessarily mean impossible. The presentation reports on current research on these very scarce “success stories” of unlikely transitions from long-term unemployment to work and how these cases were able to perform this switch against all odds. The study is based on a mixed-method design. Within the panel survey (~15,000 respondents in ~10,000 households), only 66 cases of such unlikely transitions were observed. These cases have been explored by qualitative inquiry – in depth-interviews and qualitative network techniques. There is strong evidence that sustainable transitions are influenced by certain biographical resources like habits of network use, a set of informal skills and particularly a resilient way of dealing with obstacles, combined with contextual factors rather than by job-placement procedures promoted by Job-Centers according to activation rules or by following formal paths of application. On the employer’s side small and medium-sized enterprises are often found to give job opportunities to a wider variety of applicants, often based on a slow but steadily increasing relationship leading to employment. According to these results it is possible to show and discuss some limitations of (German) activation policies targeting the labor market and their impact on welfare dependency and long-term unemployment. Based on these findings, indications for more supportive small-scale measures in the field of labor-market policies are suggested to help long-term unemployed with multiple impediments to overcome their situation (e.g. organizing small-scale-structures and low-threshold services to encounter possible employers on a more informal basis like “meet and greet”).Keywords: against-all-odds, mixed-methods, Welfare State, long-term unemployment
Procedia PDF Downloads 365
7016 Methylphenidate and Placebo Effect on Brain Activity and Basketball Free Throw: A Randomized Controlled Trial
Authors: Mohammad Khazaei, Reza Rostami, Hasan Gharayagh Zandi, Rouhollah Basatnia, Mahbubeh Ghayour Najafabadi
Abstract:
Objective: Methylphenidate has been demonstrated to enhance attention and cognitive processes, and placebo treatments have also been found to improve them. Additionally, methylphenidate may have positive effects on motion perception and sports performance. Nevertheless, additional research is needed to fully comprehend the neural mechanisms underlying the effects of methylphenidate and placebo on cognitive and motor functions. Methods: In this randomized controlled trial, 18 young semi-professional basketball players aged 18-23 years were randomly and equally assigned to either a Ritalin or a Placebo group. The participants performed 20 consecutive free throws; their scores were recorded on a 0-3 scale. The participants’ brain activity was recorded using electroencephalography (EEG) for 5 minutes while seated with their eyes closed. The Ritalin group received a 10 mg dose of methylphenidate, while the Placebo group received a 10 mg dose of placebo. The EEG was obtained 90 minutes after the drug was administered. Results: There was no significant difference in the absolute power of brain waves between the pre-test and post-test in the Placebo group. However, in the Ritalin group, a significant difference in the absolute power of brain waves was observed in the Theta band (5-6 Hz) and Beta band (21-30 Hz) between pre- and post-tests at Fp2, F8, and Fp1. In these areas, the absolute power of Beta waves was higher during the post-test than during the pre-test. The Placebo group showed a greater difference in free throw scores than the Ritalin group. Conclusions: These results suggest that Ritalin affects brain activity in areas associated with attention and cognitive processes, as well as improving basketball free throws. There was no significant placebo effect on brain activity, but the placebo significantly affected the improvement of free throws. Further research is needed to fully understand the effects of methylphenidate and placebo on cognitive and motor functions.
Keywords: methylphenidate, placebo effect, electroencephalography, basketball free throw
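Absolute band power in the reported theta (5-6 Hz) and beta (21-30 Hz) ranges can be computed from a single EEG channel with Welch's method, as in the sketch below; the sampling rate and the synthetic signal are assumptions, not the recorded data.

```python
import numpy as np
from scipy.integrate import trapezoid
from scipy.signal import welch

fs = 256                                    # assumed sampling rate (Hz)
rng = np.random.default_rng(0)
t = np.arange(0, 300, 1 / fs)               # 5 minutes of one channel (e.g. Fp1)
# Synthetic signal: broadband noise plus small theta and beta oscillations.
eeg = (rng.normal(0, 10, t.size)
       + 4 * np.sin(2 * np.pi * 5.5 * t)
       + 3 * np.sin(2 * np.pi * 25.0 * t))

freqs, psd = welch(eeg, fs=fs, nperseg=4 * fs)

def band_power(freqs, psd, lo, hi):
    """Absolute power obtained by integrating the PSD over a frequency band."""
    mask = (freqs >= lo) & (freqs <= hi)
    return trapezoid(psd[mask], freqs[mask])

print("Theta (5-6 Hz) absolute power :", band_power(freqs, psd, 5, 6))
print("Beta (21-30 Hz) absolute power:", band_power(freqs, psd, 21, 30))
```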
Procedia PDF Downloads 80
7015 Hybrid Learning and Testing at times of Corona: A Case Study at an English Department
Authors: Mimoun Melliti
Abstract:
In the wake of the global pandemic, educational systems worldwide faced unprecedented challenges and had to swiftly adapt to new conditions. This necessitated a fundamental shift in assessment processes, as traditional in-person exams became impractical. The present paper aims to investigate how educational systems have adapted to the new conditions imposed by the outbreak of the pandemic. This paper serves as a case study documenting the various decisions, conditions, experiments, and outcomes associated with transitioning the assessment processes of a higher education institution to a fully online format. The participants of this study consisted of 4666 students from health, engineering, science, and humanities disciplines, who were enrolled in general English (Eng101/104) and English for specific purposes (Eng102/113) courses at a preparatory year institution in Saudi Arabia. The findings of this study indicate that online assessment can be effectively implemented given the fulfillment of specific requirements. These prerequisites encompass the presence of competent staff, administrative flexibility, and the availability of necessary infrastructure and technological support. The significance of this case study lies in its comprehensive description of the various steps and measures undertaken to adapt to the "new normal" situation. Furthermore, it evaluates the impact of these measures and offers detailed recommendations for potential similar future scenarios.Keywords: hybrid learning, testing, adaptive teaching, EFL
Procedia PDF Downloads 62
7014 Modified Single-Folded Potentials for the Alpha-²⁴Mg and Alpha-²⁸Si Elastic Scattering
Authors: M. N. A. Abdullah, Pritha Roy, R. R. Shil, D. R. Sarker
Abstract:
The alpha-nucleus interaction is obscure because it produces enhanced cross-sections at large scattering angles, known as the anomaly in large-angle scattering (ALAS). ALAS is prominent in the elastic scattering of α-particles as well as in non-elastic processes involving α-particles for incident energies up to 50 MeV and for targets of mass A ≤ 50. The Woods-Saxon type of optical model potential fails to describe the processes in a consistent manner. A folded potential is a good candidate and is often used to construct the potential, derived from microscopic as well as semi-microscopic folding calculations. The present work reports the analyses of the elastic scattering of α-particles from ²⁴Mg and ²⁸Si at incident energies Eα = 22-100 MeV and 14.4-120 MeV, respectively, in terms of the modified single-folded (MSF) potential. To derive the MSF potential, we take the view that the nucleons in the target nuclei ²⁴Mg and ²⁸Si are primarily in α-like clusters and the rest of the time in an unclustered nucleonic configuration. The MSF potential found in this study does not need any renormalization over the whole range of incident α energies, and the renormalization factor has been found to be exactly 1 for both targets. The best-fit parameters yield 4Aα = 21 and AN = 3 for the α-²⁴Mg potential, and 4Aα = 26 and AN = 2 for the α-²⁸Si potential in time-average pictures. The root-mean-square radii of both ²⁴Mg and ²⁸Si are also deduced, and the results obtained from this work agree well with the outcomes of other studies.
Keywords: elastic scattering, optical model, folded potential, renormalization
Procedia PDF Downloads 225
7013 Conformance to Spatial Planning between the Kampala Physical Development Plan of 2012 and the Existing Land Use in 2021
Authors: Brendah Nagula, Omolo Fredrick Okalebo, Ronald Ssengendo, Ivan Bamweyana
Abstract:
The Kampala Physical Development Plan (KPDP) was developed in 2012 and projected both long-term and short-term developments within the City. The purpose of the plan was not only to shape the city into a spatially planned area but also to control the urban sprawl trends that had expanded, with pronounced instances of informal settlements. This plan was approved by the National Physical Planning Board and a signature was appended by the Minister in 2013. Much as the KPDP has been implemented using different approaches such as detailed planning, development control, subdivision planning, carrying out construction inspections, greening and beautification, there is still limited knowledge on the level of conformance to this plan. Therefore, it is yet to be determined whether it has been effective in shaping the City into an ideal spatially planned area. A clear picture of the level of conformance to the KPDP 2012 was attained through an evaluation of the planned against the existing land use in Kampala City. Methods such as supervised classification and post-classification change detection were adopted to perform this evaluation. Scrutiny of the findings revealed that Central Division registered the lowest level of conformance to the planning standards specified in the KPDP 2012, followed by Nakawa, Rubaga, Kawempe, and Makindye. Furthermore, mixed-use development was identified as the land use with the highest level of non-conformity, at 25.11%, and institutional land use registered the highest level of conformance, at 84.45%. The results show that the aspect of location was not carefully considered while allocating uses in the KPDP, whereby areas located near the Central Business District have higher land rents and hence require uses that ensure profit maximization. Also, the prominence of development towards mixed use denotes an increased demand for land for compact development that was not catered for in the plan. Therefore, in order to transform Kampala City into a spatially planned area, there is a need to carefully develop detailed plans, especially for all the Central Division planning precincts, indicating considerations for land use densification.
Keywords: spatial plan, post-classification change detection, Kampala City, land use
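The post-classification change detection step can be sketched schematically as follows: classify pixel feature tables from two dates with a supervised classifier and cross-tabulate the resulting maps; the features, classes and training data below are synthetic stand-ins, not the Kampala imagery.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
classes = ["residential", "mixed_use", "institutional", "green"]  # assumed labels

# Synthetic training table: 4 "band" features per pixel, offset by class index.
X_train = rng.normal(size=(400, 4)) + np.repeat(np.arange(4), 100)[:, None]
y_train = np.repeat(np.arange(4), 100)
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_train, y_train)

# Two dates of the same (synthetic) scene, flattened to pixel rows.
pixels_2012 = rng.normal(size=(1000, 4)) + rng.integers(0, 4, 1000)[:, None]
pixels_2021 = rng.normal(size=(1000, 4)) + rng.integers(0, 4, 1000)[:, None]
map_2012 = clf.predict(pixels_2012)
map_2021 = clf.predict(pixels_2021)

# Post-classification change matrix: rows = 2012 class, columns = 2021 class.
change_matrix = np.zeros((4, 4), dtype=int)
np.add.at(change_matrix, (map_2012, map_2021), 1)
persistence = np.trace(change_matrix) / change_matrix.sum() * 100
print("Class order:", classes)
print(change_matrix)
print(f"Pixels keeping the same class: {persistence:.1f}%")
```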
Procedia PDF Downloads 96
7012 Levels of Reflection in Engineers EFL Learners: The Path to Content and Language Integrated Learning Implementation in Chilean Higher Education
Authors: Sebastián Olivares Lizana, Marianna Oyanedel González
Abstract:
This study takes part of a major project based on implementing a CLIL program (Content and Language Integrated Learning) at Universidad Técnica Federico Santa María, a leading Chilean tertiary Institution. It aims at examining the relationship between the development of Reflective Processes (RP) and Cognitive Academic Language Proficiency (CALP) in weekly learning logs written by faculty members, participants of an initial professional development online course on English for Academic Purposes (EAP). Such course was designed with a genre-based approach, and consists of multiple tasks directed to academic writing proficiency. The results of this analysis will be described and classified in a scale of key indicators that represent both the Reflective Processes and the advances in CALP, and that also consider linguistic proficiency and task progression. Such indicators will evidence affordances and constrains of using a genre-based approach in an EFL Engineering CLIL program implementation at tertiary level in Chile, and will serve as the starting point to the design of a professional development course directed to teaching methodologies in a CLIL EFL environment in Engineering education at Universidad Técnica Federico Santa María.Keywords: EFL, EAL, genre, CLIL, engineering
Procedia PDF Downloads 398
7011 Common Space Production as a Solution to the Affordable Housing Problem: Its Relationship with the Squatting Process in Turkey
Authors: Gözde Arzu Sarıcan
Abstract:
Contemporary urbanization processes and spatial transformations are intensely debated across various fields of social sciences. One prominent concept in these discussions is "common spaces." Common spaces offer a critical theoretical framework, particularly for addressing the social and economic inequalities brought about by urbanization. This study examines the processes of commoning and their impacts through the lens of squatter neighborhoods in Turkey, emphasizing the importance of affordable housing. It focuses on the role and significance of these neighborhoods in the formation of common spaces, analyzing the collective actions and resistance strategies of residents. This process, which began with the construction of shelters to meet the shelter needs of low-income households migrating from rural to urban areas, has turned into low-quality squatter settlements over time. For low-income households lacking the economic power to rent or buy homes in the city, these areas provided an affordable housing solution. Squatter neighborhoods reflect the efforts of local communities to protect and develop their communal living spaces through collective actions and resistance strategies. This collective creation process involves the appropriation of occupied land as a common resource through the rules established by the commons. Organized occupations subdivide these lands, shaped through collective creation processes. For the squatter communities striving for economic and social adaptation, these areas serve as buffer zones for urban integration. In squatter neighborhoods, bonds of friendship, kinship, and compatriotism are strong, playing a significant role in the creation and dissemination of collective knowledge. Squatter areas can be described as common spaces that emerge out of necessity for low-income and marginalized groups. The design and construction of housing in squatter neighborhoods are shaped by the collective participation and skills of the residents. Streets are formed through collective decision-making and labor. Over time, the demands for housing are communicated to local authorities, enhancing the potential for commoning. Common spaces are shaped by collective needs and demands, appropriated, and transformed into potential new spaces. Common spaces are continually redefined and recreated. In this context, affordable housing becomes an essential aspect of these common spaces, providing a foundation for social and economic stability. This study evaluates the processes of commoning and their effects through the lens of squatter neighborhoods in Turkey. Communities living in squatter neighborhoods have managed to create and protect communal living spaces, especially in situations where official authorities have been inadequate. Common spaces are built on values such as solidarity, cooperation, and collective resistance. In urban planning and policy development processes, it is crucial to consider the concept of common spaces. Policies that support the collective efforts and resistance strategies of communities can contribute to more just and sustainable living conditions in urban areas. In this context, the concept of common spaces is considered an important tool in the fight against urban inequalities and in the expression and defense mechanisms of communities. 
By emphasizing the importance of affordable housing within these spaces, this study highlights the critical role of common spaces in addressing urban social and economic challenges.
Keywords: affordable housing, common space, squatting process, Turkey
Procedia PDF Downloads 36
7010 Contribution of Supply Chain Management Practices for Enhancing Healthcare Service Quality: A Quantitative Analysis in Delhi’s Healthcare Sector
Authors: Chitrangi Gupta, Arvind Bhardwaj
Abstract:
This study seeks to investigate and quantify the influence of various dimensions of supply chain management (namely, supplier relationships, compatibility, specifications and standards, delivery processes, and after-sales service) on distinct dimensions of healthcare service quality (specifically, responsiveness, trustworthiness, and security) within the operational framework of XYZ Superspeciality Hospital, situated in Delhi. The name of the Hospital is not being mentioned here because of the privacy policy of the hospital. The primary objective of this research is to elucidate the impact of supply chain management practices on the overall quality of healthcare services offered within hospital settings. Employing a quantitative research design, this study utilizes a hypothesis-testing approach to systematically discern the relationship between supply chain management dimensions and the quality of health services. The findings of this study underscore the significant influence exerted by supply chain management dimensions, specifically supplier relationships, specifications and standards, delivery processes, and after-sales service, on the enhancement of healthcare service quality. Moreover, the study's results reveal that demographic factors such as gender, qualifications, age, and experience do not yield discernible disparities in the relationship between supply chain management and healthcare service quality.Keywords: supply chain management, healthcare, hospital operations, service delivery
Procedia PDF Downloads 69
7009 Continuous Plug Flow and Discrete Particle Phase Coupling Using Triangular Parcels
Authors: Anders Schou Simonsen, Thomas Condra, Kim Sørensen
Abstract:
Various processes are modelled using a discrete phase, where particles are seeded from a source. Such particles can represent liquid water droplets, which affect the continuous phase by exchanging thermal energy, momentum, species etc. Discrete phases are typically modelled using parcels, where a parcel represents a collection of particles sharing properties such as temperature, velocity etc. When coupling the phases, the exchange rates are integrated over the cell in which the parcel is located. This can cause spikes and fluctuating exchange rates. This paper presents an alternative method of coupling a discrete and a continuous plug flow phase. This is done using triangular parcels, which span between nodes following the dynamics of single droplets. Thus, the triangular parcels are propagated using the corner nodes. At each time step, the exchange rates are spatially integrated over the surface of the triangular parcels, which yields a smooth continuous exchange rate to the continuous phase. The results show that the method is more stable, converges slightly faster and yields smoother exchange rates compared with the steam tube approach. However, the computational requirements are about five times greater, so the applicability of the alternative method should be limited to processes where the exchange rates are important. The overall balances of the exchanged properties did not change significantly with the new approach.
Keywords: CFD, coupling, discrete phase, parcel
Procedia PDF Downloads 268
7008 Characterization and Modelling of Aerosol Droplet in Absorption Columns
Authors: Hammad Majeed, Hanna Knuutila, Magne Hillestad, Hallvard F. Svendsen
Abstract:
Formation of aerosols can cause serious complications in industrial exhaust gas CO2 capture processes. SO3 present in the flue gas can cause aerosol formation in an absorption-based capture process. Small mist droplets and fog formed can normally not be removed in conventional demisting equipment because their submicron size allows the particles or droplets to follow the gas flow. As a consequence of this, aerosol-based emissions in the order of grams per Nm3 have been identified from PCCC plants. In absorption processes aerosols are generated by spontaneous condensation or desublimation processes in supersaturated gas phases. Undesired aerosol development may lead to amine emissions many times larger than what would be encountered in a mist-free gas phase in PCCC development. It is thus of crucial importance to understand the formation and build-up of these aerosols in order to mitigate the problem. Rigorous modelling of aerosol dynamics leads to a system of partial differential equations. In order to understand the mechanics of a particle entering an absorber, an implementation of the model is created in Matlab. The model predicts the droplet size, the droplet internal variable profiles and the mass transfer fluxes as functions of position in the absorber. The Matlab model is based on a subclass of the method of weighted residuals for boundary value problems, namely the orthogonal collocation method. The model comprises a set of mass transfer equations for the transferring components and the essential diffusion-reaction equations to describe the droplet internal profiles for all relevant constituents. Also included is heat transfer across the interface and inside the droplet. This paper presents results describing the basic simulation tool for the characterization of aerosols formed in CO2 absorption columns and gives examples of how various entering droplets grow or shrink through an absorber and how their composition changes with respect to time. Some preliminary simulation results for aerosol droplet composition and temperature profiles are given below.
Keywords: absorption columns, aerosol formation, amine emissions, internal droplet profiles, monoethanolamine (MEA), post combustion CO2 capture, simulation
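A heavily simplified illustration of a single droplet's coupled mass and heat balances is sketched below with scipy; it ignores the internal diffusion-reaction profiles that the full orthogonal collocation model resolves, and every parameter value is an assumed order of magnitude rather than the paper's data.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Assumed constants (illustrative orders of magnitude only).
rho_l, cp_l = 1000.0, 4200.0      # droplet density (kg/m3), heat capacity (J/kg/K)
k_m, h = 5e-4, 50.0               # mass transfer coeff (m/s), heat transfer coeff (W/m2/K)
c_gas, c_sat = 1.2, 0.8           # bulk and interface vapour concentrations (mol/m3)
M_w, dH_cond = 0.018, 2.4e6       # molar mass (kg/mol), latent heat (J/kg)
T_gas = 320.0                     # bulk gas temperature (K)

def droplet(t, y):
    m, T = y                                     # droplet mass (kg) and temperature (K)
    r = (3 * m / (4 * np.pi * rho_l)) ** (1 / 3)
    area = 4 * np.pi * r ** 2
    n_flux = k_m * (c_gas - c_sat) * area        # condensing vapour (mol/s)
    dm_dt = n_flux * M_w
    dT_dt = (h * area * (T_gas - T) + dm_dt * dH_cond) / (m * cp_l)
    return [dm_dt, dT_dt]

m0 = rho_l * 4 / 3 * np.pi * (0.5e-6) ** 3       # 1 um diameter droplet
sol = solve_ivp(droplet, (0.0, 0.5), [m0, 310.0], max_step=1e-3)
d_final = 2 * (3 * sol.y[0, -1] / (4 * np.pi * rho_l)) ** (1 / 3)
print(f"Final droplet diameter: {d_final * 1e6:.2f} um, temperature: {sol.y[1, -1]:.1f} K")
```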
Procedia PDF Downloads 246
7007 Modeling Environmental, Social, and Governance Financial Assets with Lévy Subordinated Processes and Option Pricing
Authors: Abootaleb Shirvani, Svetlozar Rachev
Abstract:
ESG stands for Environmental, Social, and Governance; it is a non-financial factor that investors use to identify material risks and growth opportunities in their analysis process. ESG ratings provide a quantitative measure of socially responsible investment, and it is essential to incorporate them when modeling the dynamics of asset returns. In this article, we propose a triple subordinated Lévy process for incorporating numeric ESG ratings into dynamic asset pricing theory to model the time series properties of stock returns. The motivation for introducing three layers of subordination is twofold. The first two layers capture the skewness and fat tails of the stock return distribution, which cannot be explained well by existing Lévy subordinated models. The third layer introduces ESG valuation, incorporating numeric ESG ratings into dynamic asset pricing theory and option pricing. We employ the triple subordinated Lévy model to develop an ESG-valued stock return model, derive the implied ESG score surfaces for Microsoft, Apple, and Amazon stock returns, and compare the shapes of the implied ESG score surfaces for these stocks.Keywords: ESG scores, dynamic asset pricing theory, multiple subordinated modeling, Lévy processes, option pricing
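To make the layered construction concrete, the sketch below simulates one path of a Brownian motion whose clock is replaced by three nested gamma subordinators. This is a generic illustration of multiple subordination, not the authors' specific model or parameterisation; in particular, tying the third layer's variance parameter to an ESG rating is an assumption here, and all parameter values are hypothetical.

```python
import numpy as np

# Hedged illustration of triple subordination: calendar time is run through
# three nested gamma subordinators, and a drifted Brownian motion is then
# evaluated on the innermost stochastic clock. The first two layers generate
# skewness and fat tails; the third layer stands in for the ESG-linked layer
# (its parameter nu3 is purely hypothetical here).

rng = np.random.default_rng(0)

def gamma_subordinator_increments(d_clock, nu, rng):
    """Increments of a gamma subordinator with unit mean rate:
    mean d_clock, variance nu * d_clock per unit of the driving clock."""
    d_clock = np.asarray(d_clock, dtype=float)
    return rng.gamma(shape=d_clock / nu, scale=nu)

n_steps, dt = 252, 1.0 / 252                 # one hypothetical trading year
mu, gamma_drift, sigma = 0.05, -0.1, 0.2     # hypothetical return parameters
nu1, nu2, nu3 = 0.02, 0.02, 0.05             # layer variances (nu3 ~ "ESG" layer)

d_cal = np.full(n_steps, dt)                                  # calendar clock
d_T1 = gamma_subordinator_increments(d_cal, nu1, rng)         # layer 1
d_T2 = gamma_subordinator_increments(d_T1, nu2, rng)          # layer 2
d_T3 = gamma_subordinator_increments(d_T2, nu3, rng)          # layer 3

# Drifted Brownian motion evaluated on the innermost clock.
d_logret = (mu * dt + gamma_drift * d_T3
            + sigma * np.sqrt(d_T3) * rng.standard_normal(n_steps))
log_price = np.cumsum(d_logret)
print(log_price[-1], d_logret.std() * np.sqrt(252))  # terminal log-price, ann. vol
```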
Procedia PDF Downloads 84
7006 The Power of Geography in the Multipolar World Order
Authors: Norbert Csizmadia
Abstract:
The paper is based on a thorough investigation of recent global, social and geographical processes. The author's ‘Geofusion’ book series guides readers, with the help of newly illustrated “associative” geographic maps of the 21st-century world, through the quest for the winning nations, communities, leaders and powers of this age. These constitute the research objectives, whose preliminary findings are presented in this paper. The most significant insight is that the scientists recognized as the explorers and geostrategists of this century are expected to provide guidelines for a new world full of global social and economic challenges. To do so, new maps are needed that do not discard the wisdom and tools of the old but complement them with a new structure of knowledge. Using recently identified geographic and economic interrelations, the study behind this presentation offers a prognosis of global processes. The methodology comprises the survey and analysis of many recent publications worldwide on geostrategic, cultural, geographical, social and economic topics, structured into global networks. In conclusion, the author presents the result of the study, the above-mentioned collage of the global map of the 21st century, which can be considered a contribution to the recent scientific literature on the topic. In summary, this paper presents the results of several years of research, giving the audience an image of how economic navigation tools can help investors, politicians and travelers find their way in a changing world.Keywords: geography, economic geography, geo-fusion, geostrategy
Procedia PDF Downloads 133
7005 Effect of Nanoparticles Concentration, pH and Agitation on Bioethanol Production by Saccharomyces cerevisiae BY4743: An Optimization Study
Authors: Adeyemi Isaac Sanusi, Gueguim E. B. Kana
Abstract:
Nanoparticles have received the attention of the scientific community due to their biotechnological potential. They exhibit advantageous size-, shape- and concentration-dependent catalytic, stabilizing, immunoassay and immobilization properties. This study investigates the impact of metallic oxide nanoparticles (NPs) on ethanol production by Saccharomyces cerevisiae BY4743. Nine different nanoparticles were synthesized using a precipitation method with microwave treatment. The synthesized nanoparticles were characterized by Fourier transform infrared spectroscopy (FTIR), scanning electron microscopy (SEM) and transmission electron microscopy (TEM). Fermentation processes were carried out at varied NP concentrations (0-0.08 wt%). The highest ethanol concentrations were achieved after 24 h using cobalt NPs (5.07 g/l), copper NPs (4.86 g/l) and manganese NPs (4.74 g/l) at a 0.01 wt% NP concentration, representing increases of 13%, 8.7% and 5.4%, respectively, over the control (4.47 g/l). The lowest ethanol concentration (0.17 g/l) was obtained with 0.08 wt% silver NPs, and lower ethanol concentrations were generally observed at higher NP concentrations. Ethanol concentrations decreased after 24 h in all processes. In all setups with NPs, the pH was observed to be stable, and the stability was directly proportional to the nanoparticle concentration. These findings suggest that the presence of some of the NPs in the bioprocesses has catalytic and pH-stabilizing potential. Ethanol production by Saccharomyces cerevisiae BY4743 was enhanced in the presence of cobalt, copper and manganese NPs. An optimization study using response surface methodology (RSM), as sketched below, will further elucidate the impact of these nanoparticles on bioethanol production.Keywords: agitation, bioethanol, nanoparticles concentration, optimization, pH value
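A minimal sketch of what such an RSM fit could look like is given below. The coded factor settings and ethanol yields are invented purely for illustration (they are not the study's data); a second-order model is fitted by ordinary least squares and the predicted optimum is located on a coarse grid of coded factor levels.

```python
import numpy as np

# Hypothetical coded factors: x1 = NP concentration, x2 = pH, x3 = agitation.
# The design points and responses below are made up for illustration only.
X = np.array([
    [-1, -1, -1], [ 1, -1, -1], [-1,  1, -1], [ 1,  1, -1],
    [-1, -1,  1], [ 1, -1,  1], [-1,  1,  1], [ 1,  1,  1],
    [ 0,  0,  0], [ 0,  0,  0],
], dtype=float)
y = np.array([3.9, 4.4, 4.1, 4.6, 4.0, 4.5, 4.2, 4.7, 5.0, 5.1])  # g/l, invented

def quadratic_design_matrix(X):
    """Columns: 1, x1..x3, x1^2..x3^2, x1*x2, x1*x3, x2*x3."""
    x1, x2, x3 = X.T
    return np.column_stack([np.ones(len(X)), x1, x2, x3,
                            x1**2, x2**2, x3**2,
                            x1*x2, x1*x3, x2*x3])

# Fit the second-order response surface by least squares.
beta, *_ = np.linalg.lstsq(quadratic_design_matrix(X), y, rcond=None)

# Locate the predicted optimum on a coarse grid of coded factor levels.
grid = np.array(np.meshgrid(*[np.linspace(-1, 1, 21)] * 3)).reshape(3, -1).T
pred = quadratic_design_matrix(grid) @ beta
print("predicted max ethanol (g/l):", pred.max().round(2),
      "at coded levels:", grid[pred.argmax()])
```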
Procedia PDF Downloads 188
7004 Selective Fermentations of Monosaccharides by Osmotolerant Yeast Cultures
Authors: Elizabeth Loza-Valerdi, Victor Pardiñas-Rios, Arnulfo Pluma-Pluma, Andres Breton-Toral, Julio Cercado-Jaramillo
Abstract:
The purification of mixtures of isomeric monosaccharides using industrial chromatographic methods poses a serious technical challenge. Mixtures of two or three monosaccharides are difficult to separate by strictly physical or chemical techniques. Differential fermentation by microbial cultures is an increasingly interesting route to selective enrichment of a particular monosaccharide when a mixture is present in solution and only one has economic value. Osmotolerant yeast cultures provide an interesting source of biocatalysts for the selective catabolism of monosaccharides in media containing high concentrations of total soluble sugars. A collection of 398 yeast strains was obtained from endemic and unique sources of fruit juices, industrial syrups, honey, and other high-sugar-content substrates, both natural and man-made products and by-products, from Mexico. The osmotolerance of the strains was assessed by plate assay in glucose (20, 40 and 60% w/w). Strains were classified as having low, medium or high tolerance to high glucose concentrations. The purified cultures were tested for their ability to grow on solid plates or in liquid Yeast Nitrogen Base (YNB) media supplemented with specific sugars as the sole carbon source (glucose, galactose, lactose and fructose). Selected strains were subsequently tested in fermentation experiments with mixtures of two monosaccharides (galactose/glucose and glucose/fructose), and their ability to grow and selectively catabolize one monosaccharide was evaluated. Growth, fermentation activity and products of metabolism were determined by plate counts, CO2 production, turbidity and chromatographic analysis by HPLC. Selective catabolism of one monosaccharide in liquid media containing two monosaccharides was confirmed for 8 strains. Ion-exchange chromatographic processes were used in the production of high-fructose or galactose syrups. Laboratory-scale processes for the production of fructose- or galactose-enriched syrups are now feasible, with important applications in food (e.g., high-fructose syrup as a sweetener) and fermentation technology (e.g., GOS production).Keywords: osmotolerant yeasts, selective metabolism, fructose syrup, GOS
Procedia PDF Downloads 450
7003 Diminishing Voices of Children in Mandatory Mediation Schemes
Authors: Yuliya Radanova, Agnė Tvaronavičienė
Abstract:
With the growing trend of mandating parties to family conflicts into out-of-court processes, the adopted statutory regulations often remain silent on how the voice of the child is integrated into the procedure. The Convention on the Rights of the Child (Art. 12) clearly states the obligation to assure to the child who is capable of forming his or her own views the right to express those views freely in all matters affecting him or her. This article explores how children participate in the mandatory mediation schemes applicable to family disputes in the European Union. A review of scientific literature and empirical data was conducted on those EU Member States that compel parties into family mediation; it establishes that different models of practice are deployed and that there is a lack of consistency in how the role of children in mediation is viewed. Child-inclusive mediation processes are deemed to produce sustainable results over time but require mediators to have the professional qualifications and skills to ensure that such discussions are aligned with the best interest of the child. However, there is no unanimous guidance, nor are there standards or protocols, on the particular characteristics and manner in which children are involved in mediation. It is therefore suggested that the lack of rigorous approaches and coherence in an ever-changing mediation setting, as it transitions towards mandatory models, jeopardizes the place of children's voices in the process, and that there is a need to consider the adoption of uniform guidelines on the specific role children have in mediation, particularly in its mandatory models.Keywords: family mediation, child involvement, mandatory mediation, child-inclusive, child-focused
Procedia PDF Downloads 76