Search results for: mixed methodology
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 7912

6652 Being Authentic is the New “Pieces”: A Mixed Methods Study on Authenticity among African Christian Millennials

Authors: Victor Counted

Abstract:

Staying true to oneself is complicated, and in most cases we may never fully come to terms with this reality. Like any journey, the experience of self-discovery is a rollercoaster ride. The researcher attempts to engage the reader in an empirical study of authenticity tendencies among African Christian Millennials, asking the all-important question: what does it actually mean to be true to self for the African youth? It is a comprehensive, yet unfinished, undertaking that applies authenticity theory in its exploratory navigations to uncover the "lived world" of the participants in this study. Using a mixed methods approach, the researcher gives an account of the authenticity tendencies and experiences of the respondents, providing the reader with a unique narrative for understanding what it means to be true to oneself in Africa. In the quantitative phase, the participants recorded higher scores on the authentic living subscale of the Authenticity Scale (AS), while showing significant correlations among the subscales. Hypotheses tested in this phase statistically supported gender and church affiliation as possible predictors of the participants' authenticity orientations, while being a native Christian and race/ethnicity were not statistically significant factors. These results shaped the objectives of the qualitative study, in which fifteen high-scoring AS-authentic living participants were interviewed to understand why they scored high on authentic living and what it means to be authentic. The hallmark of the qualitative case study exploration was the common coping mechanism of splitting adopted by the respondents to deal with their self-crisis as they tried to remain authentic to self, whilst self-regulating and self-investing in order to discover the 'self'.
Specifically, the researcher observed the respondents' concurrent use of a kind of religious self to regulate their self-crisis, relating to the self as it fragments through different stages of splitting in the hope of some kind of redemption. This observation led to the conclusion that being authentic is the new "pieces": authenticity is in fragments. The proposition led the researcher to introduce a hermeneutical support system that will enable future researchers to engage more critically and responsibly with their "living human documents", in order to inspire timely solutions to the concerns of authenticity and wellbeing among Millennials in Africa.

Keywords: authenticity, self, identity, self-fragmentation, weak self integration, postmodern self, splitting

Procedia PDF Downloads 520
6651 Culvert Blockage Evaluation Using Australian Rainfall And Runoff 2019

Authors: Rob Leslie, Taher Karimian

Abstract:

The blockage of cross drainage structures is a risk that needs to be understood and managed or mitigated through design. A blockage is a random event, influenced by site-specific factors, which needs to be quantified for design. Under- and overestimation of blockage can have major impacts on flood risk and on the cost associated with drainage structures. The importance of this matter is heightened for projects located within sensitive lands. It is a particularly complex problem for large linear infrastructure projects (e.g., rail corridors) located within floodplains, where blockage factors can influence flooding upstream and downstream of the infrastructure. The selection of appropriate blockage factors for hydraulic modeling has been the subject of extensive research by hydraulic engineers. This paper reviews the current Australian Rainfall and Runoff 2019 (ARR 2019) methodology for blockage assessment by applying the method to a transport corridor brownfield upgrade case study in New South Wales. The results of applying the method are also validated against asset data and maintenance records. ARR 2019, Book 6, Chapter 6 provides advice and an approach for estimating the blockage of bridges and culverts; this paper concentrates specifically on cross drainage structures. The method has been developed to estimate the blockage level for culverts affected by sediment or debris during flooding. The objective of the approach is to evaluate a numerical blockage factor that can be utilized in a hydraulic assessment of cross drainage structures. The project included an assessment of over 200 cross drainage structures. In order to estimate a blockage factor for use in the hydraulic model, a process was developed that considers the qualitative factors (e.g., debris type, debris availability) and the site-specific hydraulic factors that influence blockage.
A site rating associated with the debris potential (i.e., availability, transportability, mobility) at each crossing was completed using the method outlined in the ARR 2019 guidelines. The hydraulic model inputs (i.e., flow velocity, flow depth) and qualitative factors at each crossing were fed into a spreadsheet, where the design blockage level for each cross drainage structure was determined from the condition relating the inlet clear width, the L10 (average length of the longest 10% of the debris reaching the site), and the adjusted debris potential. Asset data, including site photos and maintenance records, were then reviewed and compared with the blockage assessment to check the validity of the results. The results of this assessment demonstrate that the blockage factors estimated at each crossing location using the ARR 2019 guidelines are well validated by the asset data. The primary finding of the study is that the ARR 2019 methodology is a suitable approach for culvert blockage assessment, here validated against a case study spanning a large geographical area and multiple sub-catchments. The study also found that the methodology can be effectively coded within a spreadsheet or similar analytical tool to automate its application.
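The spreadsheet logic described above can be sketched in a few lines of code. The ratio bands and blockage levels below are placeholders for illustration only; the actual values come from the ARR 2019 Book 6, Chapter 6 tables and site-specific assessment.

```python
# Illustrative sketch only: the ratio bands and blockage fractions below are
# placeholders, NOT the actual ARR 2019 Book 6, Chapter 6 values.
def design_blockage(inlet_clear_width_m, l10_m, adjusted_debris_potential):
    """Return an illustrative design blockage fraction for a culvert.

    inlet_clear_width_m: clear width of the culvert inlet (m)
    l10_m: average length of the longest 10% of debris reaching the site (m)
    adjusted_debris_potential: qualitative rating, 'low' / 'medium' / 'high'
    """
    ratio = inlet_clear_width_m / l10_m
    # Classify the opening relative to debris size (placeholder bands).
    if ratio >= 3.0:
        size_class = "large"
    elif ratio >= 1.0:
        size_class = "medium"
    else:
        size_class = "small"
    # Placeholder lookup of blockage level by opening class and debris potential.
    table = {
        ("large", "low"): 0.0,   ("large", "medium"): 0.1,   ("large", "high"): 0.2,
        ("medium", "low"): 0.1,  ("medium", "medium"): 0.25, ("medium", "high"): 0.5,
        ("small", "low"): 0.25,  ("small", "medium"): 0.5,   ("small", "high"): 1.0,
    }
    return table[(size_class, adjusted_debris_potential)]
```

A function like this, applied per structure, is what lets the assessment of 200+ crossings be automated in a spreadsheet or script.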

Keywords: ARR 2019, blockage, culverts, methodology

Procedia PDF Downloads 355
6650 Development of a Myocardial Patch with 3D Hydrogel Electrical Stimulation System

Authors: Yung-Gi Chen, Pei-Leun Kang, Yu-Hsin Lin, Shwu-Jen Chang

Abstract:

Myocardial tissue has limited self-repair ability because most mature cardiomyocytes have lost the capacity to differentiate. Therefore, the effective use of stem cell technology in regenerative medicine is an important development for alleviating the current difficulties in treating cardiac disease. The main purpose of this project was to develop a 3-D hydrogel electrical stimulation system for promoting the differentiation of stem cells into myocardial cells; the resulting patch will be used to repair damaged myocardial tissue. The project focused on the preparation of an electrical stimulation system with carbon/CaCl₂ electrodes covered with a carbon nanotube hydrogel. In this study, we utilized screen printing techniques on poly(lactic-co-glycolic acid) (PLGA) membranes as substrates to fabricate carbon/CaCl₂ interdigitated electrodes covered with alginate/carbon nanotube hydrogels. Single-walled carbon nanotubes (SWCNTs) were added to the hydrogel to enhance its mechanical strength and conductivity. PLGA (85:15) served as the electrode substrate. A CaCl₂/EtOH solution (80% w/v) was mixed into carbon paste to prepare calcium-containing carbon pastes of various concentrations (2.5%, 5%, 7.5%, 10% v/v). Different concentrations of alginate (1%, 1.5%, 2% v/v) and SWCNTs (diameter < 2 nm, length 5-15 μm; 1, 1.5, 3 mg/ml) were gently immobilized on the electrode by cross-linking with calcium chloride. The three-dimensional hydrogel electrode was tested for its redox efficiency by cyclic voltammetry to determine the optimal parameters for hydrogel electrode preparation. The results indicated that the interdigitated electrode pattern was difficult to maintain when the calcium chloride concentration exceeded 10%.
The gel-rate tests and cyclic voltammetry experiments showed that SWCNTs significantly increased the electron conduction of the hydrogel electrodes. The 3-D electrode system has now been completed; 2% alginate mixed with 3 mg/ml SWCNTs was the optimal condition for constructing the most complete hydrogel structure.

Keywords: myocardial tissue engineering, screen printing technology, poly (lactic-co-glycolic acid), alginate, single walled carbon nanotube

Procedia PDF Downloads 109
6649 Cars Redistribution Optimization Problem in the Free-Float Car-Sharing

Authors: Amine Ait-Ouahmed, Didier Josselin, Fen Zhou

Abstract:

Free-float car-sharing is a one-way car-sharing service where cars are available anytime and anywhere in the streets, so that no dedicated stations are needed: after driving a car, you can park it anywhere. This system creates an imbalanced distribution of cars in cities, which can be corrected by staff agents who redistribute the cars. In this paper, we aim to solve the joint car-reservation and agent-traveling problem so that the number of successful car reservations is maximized. In addition, we aim to minimize the distance traveled by agents for car redistribution. To this end, we present a mixed integer linear programming formulation for the car-sharing problem.
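The abstract does not reproduce the MILP formulation, but the two stacked objectives (maximize served reservations, then minimize travel) can be illustrated with a brute-force toy on a tiny instance. The grid coordinates, Manhattan distance, and pickup radius below are all assumptions for the sketch, not the paper's model.

```python
from itertools import permutations

def dist(a, b):
    # Manhattan distance between grid points (x, y); an assumed metric.
    return abs(a[0] - b[0]) + abs(a[1] - b[1])

def best_assignment(cars, reservations, max_pickup_dist):
    """Brute-force toy version of the lexicographic objective: first maximize
    the number of reservations served (within the pickup radius), then
    minimize total car-to-customer distance.

    Pairs the first n reservations with a choice of distinct cars; feasible
    only for tiny instances, whereas a MILP scales to realistic ones.
    """
    best = (0, float("inf"), None)  # (served, total_distance, assignment)
    n = min(len(cars), len(reservations))
    for perm in permutations(range(len(cars)), n):
        served, total = 0, 0.0
        for r, c in zip(range(n), perm):
            d = dist(cars[c], reservations[r])
            if d <= max_pickup_dist:  # reservation succeeds only if close enough
                served += 1
                total += d
        if (served, -total) > (best[0], -best[1]):
            best = (served, total, perm)
    return best[0], best[1]
```

On two cars at (0, 0) and (5, 5) with customers at (1, 0) and (5, 4) and a radius of 2, the direct pairing serves both reservations at a total distance of 2.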

Keywords: one-way car-sharing, vehicle redistribution, car reservation, linear programming

Procedia PDF Downloads 344
6648 Landslide Vulnerability Assessment in Context with Indian Himalayan

Authors: Neha Gupta

Abstract:

Landslide vulnerability is considered a crucial parameter for the assessment of landslide risk. The term vulnerability is defined as the damage or degree of loss of elements at risk along different dimensions, i.e., physical, social, economic, and environmental. The Himalaya region is very prone to multiple hazards such as floods, forest fires, earthquakes, and landslides. The increase in fatality rates and in losses of infrastructure and economic assets due to landslides in the Himalaya region motivates this assessment of vulnerability. This study presents a methodology that measures a combination of vulnerability dimensions, i.e., social, physical, and environmental vulnerability, in one framework. Such combined assessments have rarely been carried out, and no such approach had previously been applied in the Indian scenario. The methodology was applied to an area of the east Sikkim Himalaya, India. Physical vulnerability was based on a building footprint layer extracted from remote sensing data and Google Earth imagery. Social vulnerability was assessed using population density based on land use; the land use map was derived from a high-resolution satellite image. For environmental vulnerability, NDVI, forest, agricultural land, and distance from the river were assessed from remote sensing data and a DEM. The classes of social, physical, and environmental vulnerability were normalized on a scale of 0 (no loss) to 1 (total loss) to obtain a homogeneous dataset. Multi-Criteria Analysis (MCA) was then used to assign individual weights to each dimension and integrate them into one framework. The final vulnerability was further classified into four classes, from very low to very high.
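The normalization, weighting, and classification steps can be sketched as follows. The equal weights and quartile class breaks are illustrative assumptions; the study derives its own MCA weights and class limits.

```python
def normalize(values):
    """Min-max scale a list of raw indicator values to [0, 1]
    (0 = no loss, 1 = total loss), giving a homogeneous dataset."""
    lo, hi = min(values), max(values)
    return [(v - lo) / (hi - lo) for v in values]

def combined_vulnerability(social, physical, environmental,
                           weights=(1 / 3, 1 / 3, 1 / 3)):
    """Weighted linear combination of the three normalized dimensions.
    Equal weights are a placeholder; MCA assigns the real ones."""
    ws, wp, we = weights
    return [ws * s + wp * p + we * e
            for s, p, e in zip(social, physical, environmental)]

def classify(v):
    """Map a combined score in [0, 1] to one of four classes.
    The quartile cut-offs are illustrative, not the study's breaks."""
    if v < 0.25:
        return "very low"
    if v < 0.5:
        return "low"
    if v < 0.75:
        return "high"
    return "very high"
```

Each mapped unit (e.g., a building footprint cell) gets one score per dimension, one combined score, and one of the four final classes.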

Keywords: landslide, multi-criteria analysis, MCA, physical vulnerability, social vulnerability

Procedia PDF Downloads 299
6647 The Development of Explicit Pragmatic Knowledge: An Exploratory Study

Authors: Aisha Siddiqa

Abstract:

The knowledge of pragmatic practices in a particular language is considered key to effective communication. Unlike one's native language, where this knowledge is acquired spontaneously, learning second language pragmatics requires more conscious attention. Traditional foreign language (FL) classrooms generally focus on the acquisition of vocabulary and lexico-grammatical structures, neglecting the pragmatic functions that are essential for effective communication in the multilingual networks of the modern world. Of particular importance for effective communication is knowledge of what is perceived as polite or impolite in a given language, an aspect of pragmatics that is not perceived as obligatory but is nonetheless indispensable for successful intercultural communication and integration. In second language learning, the acquisition of politeness assumes greater prominence because politeness norms and practices vary across languages and cultures. Therefore, along with the 'use' of politeness strategies, it is crucial to examine their 'acquisition' and 'acquisitional development' by second language learners, particularly lower-proficiency learners, as politeness norms are usually addressed at lower levels. Hence, there is an obvious need for a study that not only investigates the acquisition of pragmatics by young FL learners using innovative multiple methods, but also identifies the potential causes of the gaps in their development. The present research employs a cross-sectional design to explore the acquisition of politeness by young English as a foreign language (EFL) learners in France at three levels of secondary school. The methodology involves two phases. In the first phase, a cartoon oral production task (COPT) is used to elicit samples of requests from young EFL learners in French schools.
These data are then supplemented by a) role plays, b) an analysis of textbooks, and c) video recordings of classroom activities. This mixed method approach allows us to explore the repertoire of politeness strategies the learners possess and to delve deeper into the opportunities available in classrooms to learn politeness strategies in requests. The paper will present the results of the analysis of COPT data for 250 learners at three different stages of English as a foreign language development. Data analysis is based on the categorization of requests developed in the CCSARP project. The preliminary analysis of the COPT data shows substantial evidence of pragmalinguistic development across all levels, but the developmental process seems to gain momentum in the second half of the secondary school period compared with the early school years. However, there is very little evidence of sociopragmatic development. The study aims to document current classroom practices in France by looking at the development of young EFL learners' politeness strategies across three levels of secondary school.

Keywords: acquisition, English, France, interlanguage pragmatics, politeness

Procedia PDF Downloads 423
6646 Benefits of Tourist Experiences for Families: A Systematic Literature Review Using Nvivo

Authors: Diana Cunha, Catarina Coelho, Ana Paula Relvas, Elisabeth Kastenholz

Abstract:

Context: Tourist experiences have a recognized impact on the well-being of individuals. However, studies on the specific benefits of tourist experiences for families are scattered across different disciplines. This study systematically reviews the literature to synthesize the evidence on the benefits of tourist experiences for families. Research Aim: The main objective is to systematize the evidence in the literature regarding the benefits of tourist experiences for families. Methodology: A systematic literature review was conducted using NVivo, analyzing 33 scientific studies obtained from various databases. The search terms used were "family"/"couple" and "tourist experience". The studies included quantitative, qualitative, and mixed methods designs, as well as literature reviews. All works prior to the year 2000 were excluded, and the search was restricted to full text. A language filter was also applied, considering articles in Portuguese, English, and Spanish. For the NVivo analysis, information was coded from both deductive and inductive perspectives. To minimize the subjectivity of the selection and coding process, two of the authors discussed the process and agreed on criteria to make the coding more objective. Once the coding process in NVivo was completed, the data relating to the identification/characterization of the works were exported to the Statistical Package for the Social Sciences (SPSS) to characterize the sample. Findings: The results highlight that tourist experiences have several benefits for family systems, including the strengthening of family and marital bonds, the creation of family memories, and overall well-being and life satisfaction. These benefits contribute both to the immediate improvement of relationship quality and to long-term family identity construction and transgenerational transmission. Theoretical Importance: This study emphasizes the systemic nature of the effects and relationships within family systems.
It also shows that no harm was reported for these experiences, with only some challenges related to the positive outcomes. Data Collection and Analysis Procedures: The study collected data from 33 scientific studies, published predominantly after 2013. The data were analyzed using NVivo, employing a systematic review approach. Question Addressed: The study addresses the question of the benefits of tourist experiences for families and how these experiences contribute to family functioning and individual well-being. Conclusion: Tourist experiences provide opportunities for families to enhance their interpersonal relationships and create lasting memories. The findings suggest that formal evidence-based interventions could further enhance the potential benefits of these experiences and serve as a valuable preventive tool in therapeutic interventions.

Keywords: family systems, individual and family well-being, marital satisfaction, tourist experiences

Procedia PDF Downloads 68
6645 Assessment of the Impact of Teaching Methodology on Skill Acquisition in Music Education among Students in Emmanuel Alayande University of Education, Oyo

Authors: Omotayo Abidemi Funmilayo

Abstract:

Skill acquisition in professional fields has been prioritized and considered important for demonstrating mastery of a subject matter and presenting oneself as an expert in the profession. The ability to acquire skills in different fields, however, calls for different methods from the instructor or teacher during training. Music is no exception: it is a profession with different areas of skill acquisition that require practical performance. This paper focuses on the impact of different teaching methods on the acquisition of practical skill in handling musical instruments among the students of Emmanuel Alayande College of Education, Oyo. In this study, 30 students were selected and divided into two groups based on the selected area of learning; each of the two major groups was further divided into subgroups of five students each, to be trained using different methodologies for two months, three hours per week. Comparisons of the skills acquired were made using a standard research instrument at a reliable level of significance, and tests were carried out on the thirty students based on their area of skill acquisition. The students trained on the keyboard and saxophone using the play-way method performed best, followed by the students trained using the demonstration method, while the students who received instruction through the lecture method performed below average. In conclusion, the study reveals that the ability to acquire professional skill in handling musical instruments is better enhanced using the play-way method.

Keywords: music education, skill acquisition, keyboard, saxophone

Procedia PDF Downloads 70
6644 Integrating Inference, Simulation and Deduction in Molecular Domain Analysis and Synthesis with Peculiar Attention to Drug Discovery

Authors: Diego Liberati

Abstract:

Standard molecular modeling is traditionally done through the Schrödinger equations, with the help of powerful tools that manage them atom by atom, often requiring High Performance Computing. Here, a full portfolio of new tools is offered, conjugating statistical inference in the so-called eXplainable Artificial Intelligence framework (in the form of machine learning of understandable rules) with the more traditional modeling, simulation, and control theory of mixed logic-dynamic hybrid processes. The approach is quite general purpose, even though it is exemplified here on a popular set of chemical physics problems.
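As a minimal illustration of one of the clustering building blocks named in the keywords, here is a plain k-means sketch in stdlib Python. It is a generic toy, not the authors' pipeline, and the sample points are invented.

```python
import random

def kmeans(points, k, iters=20, seed=0):
    """Plain k-means on 2-D points: alternate assigning each point to its
    nearest center and recomputing each center as its cluster mean.
    A minimal stdlib sketch, not the authors' rule-extraction pipeline."""
    rng = random.Random(seed)
    centers = rng.sample(points, k)  # initialize from k distinct data points
    for _ in range(iters):
        # Assignment step: each point joins the cluster of its nearest center.
        clusters = [[] for _ in range(k)]
        for p in points:
            i = min(range(k),
                    key=lambda j: (p[0] - centers[j][0]) ** 2
                                + (p[1] - centers[j][1]) ** 2)
            clusters[i].append(p)
        # Update step: move each center to the mean of its cluster
        # (empty clusters keep their previous center).
        centers = [
            (sum(p[0] for p in c) / len(c), sum(p[1] for p in c) / len(c))
            if c else centers[i]
            for i, c in enumerate(clusters)
        ]
    return centers, clusters
```

On two well-separated blobs the centers converge to the blob means, which is the kind of structure that understandable-rule extraction can then describe.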

Keywords: understandable rules ML, k-means, PCA, PieceWise Affine Auto Regression with eXogenous input

Procedia PDF Downloads 28
6643 Development and Compositional Analysis of Functional Bread and Biscuit from Soybean, Peas and Rice Flour

Authors: Jean Paul Hategekimana, Bampire Claudine, Niyonsenga Nadia, Irakoze Josiane

Abstract:

Peas, soybeans, and rice are crops grown in Rwanda; they are available in rural and urban local markets and contribute to reducing health problems, especially in fighting malnutrition and food insecurity in Rwanda. Several research activities have been conducted on how cereal flours can be mixed with legume flours to develop baked products that are rich in the protein, fiber, and minerals found in legumes. However, such work had not yet been well studied in Rwanda. The aim of the present study was to develop bread and biscuit products from peas, soybeans, and rice as functional ingredients combined with wheat flour, and then to analyze the nutritional content and consumer acceptability of the newly developed products. The malnutrition problem can be reduced by producing breads and biscuits that are rich in protein and accessible to every individual. For processing, pea flour, soybean flour, and rice flour were mixed with wheat flour and other ingredients; a dough was then made, followed by baking. For bread, two kinds of products were processed; for each product, one control and three experimental samples were prepared in three different ratios. These ratios were 95:5, 90:10, and 80:20 for bread from peas, and 85:5:10, 80:10:10, and 70:10:20 for bread from peas and rice. For biscuits, two kinds of products were also processed; for each, one control sample and three experimental samples were prepared in three different ratios: 90:5:5, 80:10:10, and 70:10:20 for biscuits from peas and rice, and 90:5:5, 80:10:10, and 70:10:20 for biscuits from soybeans and rice. All samples, including the controls, were analyzed for consumer acceptability (sensory attributes) and nutritional composition.
In the sensory analysis, the bread from pea and rice flour with wheat flour at a ratio of 85:5:10, the bread from peas only as a functional ingredient with wheat flour at a ratio of 95:5, the biscuits made from soybeans and rice at a ratio of 90:5:5, and the biscuits made from peas and rice at a ratio of 90:5:5 were the most acceptable compared with the control and the other ratios. The moisture, protein, fat, fiber, and mineral (sodium and iron) contents were analyzed; the bread from peas in all ratios was found to be richer in protein and fiber than the control sample, and the biscuit from soybeans and rice in all ratios was found to be richer in protein and fiber than the control sample.

Keywords: bakery products, peas and rice flour, wheat flour, sensory evaluation, proximate composition

Procedia PDF Downloads 63
6642 Systematic Mapping Study of Digitization and Analysis of Manufacturing Data

Authors: R. Clancy, M. Ahern, D. O’Sullivan, K. Bruton

Abstract:

The manufacturing industry is currently undergoing a digital transformation as part of the mega-trend Industry 4.0. As part of this phase of the industrial revolution, traditional manufacturing processes are being combined with digital technologies to achieve smarter and more efficient production. To successfully digitally transform a manufacturing facility, the processes must first be digitized: the conversion of information from an analogue format to a digital format. The objective of this study was to explore the research area of digitizing manufacturing data as part of the worldwide paradigm, Industry 4.0. The formal methodology of a systematic mapping study was utilized to capture a representative sample of the research area and assess its current state. Specific research questions were defined to assess the key benefits and limitations associated with the digitization of manufacturing data. Research papers were classified according to the type of research and type of contribution to the research area. Upon analyzing the 54 papers identified in this area, it was noted that 23 originated in Germany. This is unsurprising, as Industry 4.0 is originally a German strategy, with strong policy instruments utilized in Germany to support its implementation. It was also found that the Fraunhofer Institute for Mechatronic Systems Design, in collaboration with the University of Paderborn in Germany, was the most frequent contributing institution, with three papers published. The literature suggested future research directions and highlighted one specific gap in the area: there exists an unresolved gap between the data science experts and the manufacturing process experts in industry. The data analytics expertise is not useful unless the manufacturing process information is utilized.
A legitimate understanding of the data is crucial to perform accurate analytics and gain true, valuable insights into the manufacturing process. There lies a gap between the manufacturing operations and the information technology/data analytics departments within enterprises, which was borne out by the results of many of the case studies reviewed as part of this work. To test the concept of this gap existing, the researcher initiated an industrial case study in which they embedded themselves between the subject matter expert of the manufacturing process and the data scientist. Of the papers resulting from the systematic mapping study, 12 of the papers contributed a framework, another 12 of the papers were based on a case study, and 11 of the papers focused on theory. However, there were only three papers that contributed a methodology. This provides further evidence for the need for an industry-focused methodology for digitizing and analyzing manufacturing data, which will be developed in future research.

Keywords: analytics, digitization, industry 4.0, manufacturing

Procedia PDF Downloads 111
6641 The Mental Workload of Intensive Care Unit Nurses in Performing Human-Machine Tasks: A Cross-Sectional Survey

Authors: Yan Yan, Erhong Sun, Lin Peng, Xuchun Ye

Abstract:

Aims: The present study aimed to explore Intensive Care Unit (ICU) nurses' mental workload (MWL) and the factors associated with it in performing human-machine tasks. Background: A wide range of emerging technologies has penetrated the field of health care, and ICU nurses are facing a dramatic increase in nursing human-machine tasks. However, there is still a paucity of literature reporting on the general MWL of ICU nurses performing human-machine tasks and the associated influencing factors. Methods: A cross-sectional survey was employed. Data were collected from January to February 2021 from 9 tertiary hospitals in 6 provinces (Shanghai, Gansu, Guangdong, Liaoning, Shandong, and Hubei). Two-stage sampling was used to recruit eligible ICU nurses (n=427). The data were collected with an electronic questionnaire comprising sociodemographic characteristics and measures of MWL, self-efficacy, system usability, and task difficulty. Univariate analysis, two-way analysis of variance (ANOVA), and a linear mixed model were used for data analysis. Results: Overall, the mental workload of ICU nurses in performing human-machine tasks was medium (score 52.04 on a 0-100 scale). Among the typical nursing human-machine tasks selected, the MWL of ICU nurses in completing first aid and life support tasks ('Using a defibrillator to defibrillate' and 'Use of ventilator') was significantly higher than for the others (p < .001). ICU nurses' MWL in performing human-machine tasks was also associated with age (p = .001), professional title (p = .002), years of working in the ICU (p < .001), willingness to study emerging technology actively (p = .006), task difficulty (p < .001), and system usability (p < .001). Conclusion: The MWL of ICU nurses is at a moderate level in the context of a rapid increase in nursing human-machine tasks.
However, there are significant differences in MWL when performing different types of human-machine tasks, and MWL can be influenced by a combination of factors. Nursing managers need to develop intervention strategies in multiple ways. Implications for practice: Multidimensional approaches are required to perform human-machine tasks better, including enhancing nurses' willingness to learn emerging technologies actively, developing training strategies that vary with tasks, and identifying obstacles in the process of human-machine system interaction.

Keywords: mental workload, nurse, ICU, human-machine, tasks, cross-sectional study, linear mixed model, China

Procedia PDF Downloads 69
6640 Identifying Model to Predict Deterioration of Water Mains Using Robust Analysis

Authors: Go Bong Choi, Shin Je Lee, Sung Jin Yoo, Gibaek Lee, Jong Min Lee

Abstract:

In South Korea, it is difficult to obtain data for statistical pipe assessment. In this paper, to address this issue, we examine how the various statistical models presented in earlier work respond to data mixed with noise, and whether they can be applied in South Korea. Three major types of model are studied; where data are presented in the original paper, we add noise to the data and observe how the model response changes. Moreover, we generate data from the models in the papers and analyse the effect of noise. From this, we can assess the robustness of each model and its applicability in Korea.

Keywords: proportional hazard model, survival model, water main deterioration, ecological sciences

Procedia PDF Downloads 741
6639 Engineering of E-Learning Content Creation: Case Study for African Countries

Authors: María-Dolores Afonso-Suárez, Nayra Pumar-Carreras, Juan Ruiz-Alzola

Abstract:

This research addresses the use of an e-learning creation methodology for learning objects. Throughout the process, indicators are gathered to determine whether the methodology responds to the main objectives of an engineering discipline. These parameters will also indicate whether it is necessary to review the creation cycle and readjust any phase. Within the project developed for this study, apart from the use of structured methods, there has been a central objective: the establishment of a learning atmosphere, a place where all the professionals involved are able to collaborate, plan, solve problems, and determine guidelines to follow in order to develop creative and innovative solutions. It has been outlined as a blended learning program with an assessment plan that proposes face-to-face lessons, coaching, collaboration, multimedia and web-based learning objects, as well as support resources. The project has been planned as a long-term task; the pilot teaching actions designed so far provide the preliminary results studied here. This methodology is being used to create learning content for the African countries of Senegal, Mauritania, and Cape Verde. It has been developed within the framework of MACbioIDi, an Interreg European project for international cooperation and development. The educational component of this project focuses on training and advising medical professionals as well as engineers in the use of medical imaging technology applications, specifically the 3DSlicer application and the Open Anatomy Browser.

Keywords: teaching contents engineering, e-learning, blended learning, international cooperation, 3dslicer, open anatomy browser

Procedia PDF Downloads 172
6638 Efficiency and Equity in Italian Secondary School

Authors: Giorgia Zotti

Abstract:

This research comprehensively investigates the multifaceted interplay among school performance, individual backgrounds, and regional disparities within the landscape of Italian secondary education. Leveraging data from the INVALSI 2021-2022 database, the analysis scrutinizes two fundamental distributions of educational achievement: the standardized Invalsi test scores and official grades in Italian and Mathematics, focusing specifically on final-year secondary school students in Italy. The study initially employs Data Envelopment Analysis (DEA) to assess school performance. This involves constructing a production function encompassing inputs (hours spent at school) and outputs (Invalsi scores in Italian and Mathematics, along with official grades in Italian and Math). The DEA approach is applied in both of its versions, traditional and conditional; the latter incorporates environmental variables such as school type, size, demographics, technological resources, and socio-economic indicators. Additionally, the analysis delves into regional disparities by leveraging the Theil index, providing insights into disparities within and between regions. Moreover, within the framework of inequality-of-opportunity theory, the study quantifies the inequality of opportunity in students' educational achievements. The methodology applied is the parametric approach in its ex-ante version, considering diverse circumstances such as parental education and occupation, gender, school region, birthplace, and language spoken at home. A Shapley decomposition is then applied to understand how much each circumstance affects the outcomes. The outcomes of this investigation unveil pivotal determinants of school performance, notably highlighting the influence of school type (Liceo) and socioeconomic status.
The research unveils regional disparities, elucidating instances where specific schools outperform others in official grades compared to Invalsi scores, shedding light on the intricate nature of regional educational inequalities. Furthermore, it emphasizes a heightened inequality of opportunity within the distribution of Invalsi test scores in contrast to official grades, underscoring pronounced disparities at the student level. This analysis provides insights for policymakers, educators, and stakeholders, fostering a nuanced understanding of the complexities within Italian secondary education.
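The within/between decomposition that the abstract uses to study regional disparities can be illustrated with a small sketch. The data below are invented for illustration; only the Theil T index and its standard decomposition are taken as given.

```python
import numpy as np

def theil_index(x):
    """Theil T index: 0 under perfect equality, larger = more inequality."""
    x = np.asarray(x, dtype=float)
    r = x / x.mean()
    return float(np.mean(r * np.log(r)))

def theil_decomposition(groups):
    """Split the overall Theil T into within-group and between-group parts."""
    all_x = np.concatenate(groups)
    mu, n = all_x.mean(), all_x.size
    # Weights are income (here: score) shares, so within + between = total.
    within = sum((g.size / n) * (g.mean() / mu) * theil_index(g) for g in groups)
    between = sum((g.size / n) * (g.mean() / mu) * np.log(g.mean() / mu)
                  for g in groups)
    return within, between

# Hypothetical test scores for two regions.
north = np.array([200.0, 210.0, 190.0, 205.0])
south = np.array([150.0, 160.0, 140.0, 155.0])
w, b = theil_decomposition([north, south])
total = theil_index(np.concatenate([north, south]))
print(total, w + b)   # the two parts sum exactly to the overall index
```

In this toy example the between-region term dominates, which is the pattern a regional-disparity analysis would flag.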

Keywords: inequality, education, efficiency, DEA approach

Procedia PDF Downloads 75
6637 Exploration and Evaluation of the Effect of Multiple Countermeasures on Road Safety

Authors: Atheer Al-Nuaimi, Harry Evdorides

Abstract:

Every day, many people die or are disabled or injured on roads around the world, which necessitates more specific treatments for transportation safety issues. The International Road Assessment Programme (iRAP) model is one of the comprehensive road safety models, accounting for many factors that affect road safety in a cost-effective way in low- and middle-income countries. In the iRAP model, road safety is divided into five star ratings, from 1 star (the lowest level) to 5 stars (the highest level). These star ratings are based on a star rating score calculated by the iRAP methodology from road attributes, traffic volumes and operating speeds. The outcomes of the iRAP methodology are the treatments that can be used to improve road safety and reduce the number of fatalities and serious injuries (FSI). These countermeasures can be used separately, as a single countermeasure, or combined as multiple countermeasures at a location. There is general agreement that the effectiveness of a countermeasure tends to decline when it is used in combination with other countermeasures; that is, the crash-reduction estimates of individual countermeasures cannot simply be added together. The iRAP model methodology therefore makes use of multiple-countermeasure adjustment factors to predict reductions in the effectiveness of road safety countermeasures when more than one countermeasure is chosen. A multiple-countermeasure correction factor is computed for every 100-meter segment and for every crash type. However, limitations of this methodology include a probable over-estimation of the predicted crash reduction. This study aims to adjust this correction factor by developing new models to calculate the effect of using multiple countermeasures on the number of fatalities for a location or an entire road. Regression models have been used to establish relationships between crash frequencies and the factors that affect their rates.
Multiple linear regression, negative binomial regression, and Poisson regression techniques were used to develop models that can address the effectiveness of using multiple countermeasures. Analyses conducted using the R Project for Statistical Computing showed that a model developed with the negative binomial regression technique gives more reliable predictions of the number of fatalities after the implementation of multiple road safety countermeasures than the iRAP model. The results also showed that the negative binomial regression approach gives more precise results than the multiple linear and Poisson regression techniques because of overdispersion and standard error issues.
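The overdispersion that motivates the negative binomial choice can be shown with a short sketch. The numbers below are invented: crash counts per segment are simulated as a gamma-Poisson mixture (which is exactly a negative binomial), so the variance exceeds the mean and a plain Poisson model would understate the standard errors.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical crash counts per road segment: gamma-Poisson mixture,
# i.e. negative binomial with  Var = mu + alpha * mu^2  (overdispersed).
mu, alpha = 4.0, 0.5
lam = rng.gamma(shape=1.0 / alpha, scale=mu * alpha, size=5000)
counts = rng.poisson(lam)

m, v = counts.mean(), counts.var()
# Method-of-moments recovery of the overdispersion parameter alpha.
alpha_hat = (v - m) / m ** 2
print(m, v, alpha_hat)   # variance well above the mean
```

A Poisson model forces variance = mean; when `alpha_hat` is clearly positive, as here, a negative binomial specification is the appropriate one, matching the finding in the abstract.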

Keywords: international road assessment program, negative binomial, road multiple countermeasures, road safety

Procedia PDF Downloads 239
6636 Development of Low Glycemic Gluten Free Bread from Barnyard Millet and Lentil Flour

Authors: Hemalatha Ganapathyswamy, Thirukkumar Subramani

Abstract:

Celiac disease is an autoimmune response to dietary wheat gluten. Gluten is the main structure-forming protein in bread, and hence developing gluten-free bread is a technological challenge. The study aims at using non-wheat flours, such as barnyard millet and lentil flour, to replace wheat in bread formulations. Other characteristics of these grains, such as high protein, soluble fiber, mineral content and bioactive components, make them attractive alternatives to traditional gluten-free ingredients in the production of high-protein, gluten-free bread. The composite flour formulations for the development of gluten-free bread were optimized using lentil flour (50 to 70 g), barnyard millet flour (0 to 30 g) and corn flour (0 to 30 g) by means of response surface methodology, with various independent variables for physical, sensorial and nutritional characteristics. The optimized composite flour, which had a desirability value of 0.517, included lentil flour (62.94 g), barnyard millet flour (24.34 g) and corn flour (12.72 g), with an overall acceptability score of 8.00/9.00. The optimized gluten-free bread formulation had high protein (14.99 g/100 g) and fiber (1.95 g/100 g) content. The glycemic index of the gluten-free bread was 54.58, classifying it as low glycemic, which enhances its functional benefit. Since the standardised gluten-free bread from barnyard millet and lentil flour is high in protein and has a low glycemic index, the product would serve as an ideal therapeutic food in the management of both celiac disease and diabetes mellitus, with better nutritional value.
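The desirability value quoted above comes from the standard desirability-function step of response surface methodology: each response is mapped to a 0-1 score and the scores are combined by a geometric mean. The sketch below uses invented target ranges (the paper's actual response models and limits are not given), so the resulting value is illustrative only.

```python
import numpy as np

def desirability_larger_is_better(y, low, high, weight=1.0):
    """Map a response to [0, 1]: 0 below `low`, 1 above `high`, ramp between."""
    d = np.clip((y - low) / (high - low), 0.0, 1.0)
    return float(d ** weight)

def overall_desirability(ds):
    """Geometric mean of the individual desirabilities."""
    ds = np.asarray(ds, dtype=float)
    return float(ds.prod() ** (1.0 / ds.size))

# Hypothetical responses for one flour blend, with assumed target ranges.
d_protein = desirability_larger_is_better(14.99, low=10.0, high=18.0)
d_fibre   = desirability_larger_is_better(1.95,  low=1.0,  high=3.0)
d_sensory = desirability_larger_is_better(8.00,  low=6.0,  high=9.0)
D = overall_desirability([d_protein, d_fibre, d_sensory])
print(round(D, 3))
```

An RSM optimizer searches the flour proportions to maximize this combined D; the geometric mean ensures that a blend failing badly on any single response scores poorly overall.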

Keywords: gluten free bread, lentil, low glycemic index, response surface methodology

Procedia PDF Downloads 187
6635 Reinforcement Learning for Robust Missile Autopilot Design: TRPO Enhanced by Schedule Experience Replay

Authors: Bernardo Cortez, Florian Peter, Thomas Lausenhammer, Paulo Oliveira

Abstract:

Designing missiles' autopilot controllers has been a complex task, given the extensive flight envelope and the nonlinear flight dynamics. A solution that can excel both in nominal performance and in robustness to uncertainties is still to be found. While Control Theory often resorts to parameter-scheduling procedures, Reinforcement Learning has presented interesting results in ever more complex tasks, ranging from videogames to robotic tasks with continuous action domains. However, it still lacks clear insights on how to find adequate reward functions and exploration strategies. To the best of our knowledge, this work is a pioneer in proposing Reinforcement Learning as a framework for flight control. In fact, it aims at training a model-free agent that can control the longitudinal nonlinear flight dynamics of a missile, achieving the target performance and robustness to uncertainties. To that end, under TRPO's methodology, the collected experience is augmented according to HER, stored in a replay buffer and sampled according to its significance. Not only does this work enhance the concept of prioritized experience replay into BPER, but it also reformulates HER, activating both only when the training progress converges to suboptimal policies, in what is proposed as the SER methodology. The results show that it is possible both to achieve the target performance and to improve the agent's robustness to uncertainties (with little damage to nominal performance) by further training it in non-nominal environments, therefore validating the proposed approach and encouraging future research in this field.
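The "stored in a replay buffer and sampled according to its significance" step can be sketched with a minimal prioritized buffer. This is a generic illustration of priority-proportional sampling, not the paper's BPER/SER implementation; the transition format and priority definition (absolute TD error) are assumptions.

```python
import random

class PrioritizedReplayBuffer:
    """Minimal buffer: sampling probability proportional to a significance score."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.data, self.priorities = [], []

    def add(self, transition, td_error):
        if len(self.data) >= self.capacity:   # evict the oldest entry
            self.data.pop(0)
            self.priorities.pop(0)
        self.data.append(transition)
        self.priorities.append(abs(td_error) + 1e-6)  # avoid zero weight

    def sample(self, batch_size):
        # Draw with probability proportional to stored priority.
        return random.choices(self.data, weights=self.priorities, k=batch_size)

random.seed(0)
buf = PrioritizedReplayBuffer(capacity=100)
for i in range(10):
    buf.add({"step": i}, td_error=float(i))   # later steps deemed more significant
batch = buf.sample(5)
print(batch)
```

A production implementation would typically use a sum-tree for O(log n) sampling and importance-sampling weights to correct the induced bias.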

Keywords: reinforcement learning, flight control, HER, missile autopilot, TRPO

Procedia PDF Downloads 263
6634 Economic Analysis of a Carbon Abatement Technology

Authors: Hameed Rukayat Opeyemi, Pericles Pilidis Pagone Emmanuele, Agbadede Roupa, Allison Isaiah

Abstract:

Climate change represents one of the single most challenging problems facing the world today. According to the National Oceanic and Atmospheric Administration, atmospheric temperature has risen almost 25% since 1958, Arctic sea ice has shrunk 40% since 1959 and global sea levels have risen more than 5.5 cm since 1990. Power plants are the major culprits of GHG emission to the atmosphere. Several technologies have been proposed to reduce the amount of GHG emitted to the atmosphere from power plants, one of which is the less-researched advanced zero-emission power plant. Advanced zero-emission power plants make use of a mixed conductive membrane (MCM) reactor, also known as an oxygen transfer membrane (OTM), for oxygen transfer. The MCM employs a membrane separation process. The membrane separation process was first introduced in 1899, when Walther Hermann Nernst investigated electric current between metals and solutions. He found that when a dense ceramic is heated, a current of oxygen ions moves through it. In the bid to curb the amount of GHG emitted to the atmosphere, the membrane separation process was applied to the field of power engineering in the low-carbon cycle known as the advanced zero-emission power plant (AZEP) cycle. The AZEP cycle was originally invented by Norsk Hydro, Norway, and ABB Alstom Power (now known as Demag Delaval Industrial Turbomachinery AB), Sweden. The AZEP cycle drew a lot of attention because of its ability to capture ~100% of CO2; it also boasts a 30-50% cost reduction compared with other carbon abatement technologies, its efficiency penalty is not as severe as that of its counterparts, and it achieves almost zero NOx emissions due to the very low nitrogen concentrations in the working fluid. Advanced zero-emission power plants differ from a conventional gas turbine in that the combustor is substituted with the mixed conductive membrane reactor (MCM-reactor).
The MCM-reactor is made up of the combustor, the low-temperature heat exchanger (LTHX, referred to by some authors as the air preheater), the mixed conductive membrane responsible for oxygen transfer, the high-temperature heat exchanger and, in some layouts, the bleed gas heat exchanger. Air is taken in by the compressor and compressed to a temperature of about 723 K and a pressure of 2 MPa. The membrane area needed for oxygen transfer is reduced by increasing the temperature of 90% of the air using the LTHX; the temperature is also increased to facilitate oxygen transfer through the membrane. The air stream enters the LTHX through the transition duct leading to the inlet of the LTHX. The temperature of the air stream is then increased to about 1150 K, depending on the design point specification of the plant and the efficiency of the heat exchanging system. The amount of oxygen transported through the membrane is directly proportional to the temperature of the air passing through it. The AZEP cycle was modelled in Fortran, and the economic analysis was conducted using Excel and MATLAB, followed by an optimization case study. Four possible layouts were considered: the simple bleed gas heat exchange layout (100% CO2 capture), the bleed gas heat exchanger layout with flue gas turbine (100% CO2 capture), the pre-expansion reheating (sequential burning) layout, AZEP 85% (85% CO2 capture), and the pre-expansion reheating (sequential burning) layout with flue gas turbine, AZEP 85% (85% CO2 capture). This paper discusses a Monte Carlo risk analysis of these four layouts of the AZEP cycle.
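The Monte Carlo risk analysis step can be sketched as follows. Everything numeric here is an invented placeholder (capital cost, fuel price, carbon-credit distributions, discount rate); the sketch only shows the mechanics of sampling uncertain inputs and collecting the distribution of net present value (NPV) for one hypothetical layout.

```python
import random

random.seed(1)

def npv(cash_flows, rate=0.08):
    """Net present value of a cash-flow sequence starting at year 0."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows))

def simulate_layout(n_runs=10000, capex=-900.0, years=20):
    """Monte Carlo over uncertain fuel price and carbon credit (toy numbers)."""
    results = []
    for _ in range(n_runs):
        fuel = random.gauss(40.0, 8.0)         # assumed fuel cost
        credit = random.gauss(25.0, 10.0)      # assumed CO2 credit
        annual = 120.0 - fuel + 0.8 * credit   # toy annual net cash flow
        results.append(npv([capex] + [annual] * years))
    return results

runs = simulate_layout()
mean_npv = sum(runs) / len(runs)
p_loss = sum(r < 0 for r in runs) / len(runs)  # probability the layout loses money
print(mean_npv, p_loss)
```

Repeating this for each of the four layouts and comparing the NPV distributions (not just the means) is what distinguishes a risk analysis from a single-point economic evaluation.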

Keywords: gas turbine, global warming, green house gas, fossil fuel power plants

Procedia PDF Downloads 396
6633 A Closer Look on Economic and Fiscal Incentives for Digital TV Industry

Authors: Yunita Anwar, Maya Safira Dewi

Abstract:

With the increasing importance of the digital TV industry, several incentives must be given to support the growth of the industry. Prior research has found mixed evidence on the contribution of economic and fiscal incentives to economic growth, which means these incentives do not necessarily boost economic growth while providing support to a particular industry. Focusing on the setting of the digital TV transition in Indonesia, this research conducts a document analysis of incentives that have been given in other countries and incentives currently available in Indonesia. Our results recommend that a VAT exemption and local tax incentives be considered for addition to the list of incentives available to the digital TV industry.

Keywords: digital TV transition, economic incentives, fiscal incentives, policy

Procedia PDF Downloads 322
6632 The Development of Cultural Routes: The Case of Greece

Authors: Elissavet Kosta

Abstract:

Introduction: In this research, we propose the methodology required for planning cultural routes, in order to prepare substantiated proposals for the development and planning of cultural routes in Greece in the near future. Our research started in 2016. Methodology in our research: A combination of primary and secondary research will be used as the project methodology. Furthermore, this study aims to follow a multidisciplinary approach, using dimensions of qualitative and quantitative data analysis models. Regarding the documentation of the theoretical part of the project, the method of secondary research will mainly be used, in combination with bibliographic sources. However, the data collection regarding the research topic will be conducted exclusively through primary research (questionnaires and interviews). Cultural Routes: A cultural route is defined as a brand-name touristic product, that is, a product of cultural tourism, which is shaped according to a specific connecting element. Given its potential, the cultural route is an important 'tool' for the management and development of cultural heritage. A constant development of cultural routes has been observed at the international level during the last decades, as it is widely accepted that cultural tourism plays an important role in the world tourism industry. Cultural Routes in Greece: For Greece in particular, we believe, actions have not yet been taken towards the systematic development of cultural routes. The cultural routes that include Greece and have been designed on a world scale, as well as the cultural routes designed on Greek ground up to this moment, are initiatives of the Council of Europe, the World Tourism Organization (UNWTO) and the 'Diazoma' association.
Regarding the study of cultural routes in Greece as a multidimensional concept, the following concerns have arisen: Firstly, we are concerned about the general impact of cultural routes at local and national level, specifically in the economic sector. Moreover, we deal with the concerns regarding the natural environment, and we delve into the educational aspect of cultural routes in Greece. In addition, the audience we aim at is both specific and broad, and we put forward the institutional framework of the study. Finally, we conduct the development and planning of new cultural routes, with museums in mind as both the starting and ending points of a route. Conclusion: The contribution of our work is twofold: firstly, we attempt to create cultural routes in Greece, and secondly, an interdisciplinary approach is engaged towards realizing our study objective. In particular, our aim is to take advantage of all the ways in which the promotion of a cultural route can positively influence the way of life of society. As a result, we intend to analyze how a cultural route can turn into a well-organized activity that can be used as a social intervention to develop tourism, strengthen the economy and improve access to cultural goods in Greece during the economic crisis.

Keywords: cultural heritage, cultural routes, cultural tourism, Greece

Procedia PDF Downloads 233
6631 Isolation, Purification and Characterisation of Non-Digestible Oligosaccharides Derived from Extracellular Polysaccharide of Antarctic Fungus Thelebolus Sp. IITKGP-BT12

Authors: Abinaya Balasubramanian, Satyabrata Ghosh, Satyahari Dey

Abstract:

Non-digestible oligosaccharides (NDOs) are low-molecular-weight carbohydrates with a degree of polymerization (DP) of 3-20 that are delivered intact to the large intestine. NDOs are gaining attention as effective prebiotic molecules that facilitate the prevention and treatment of several chronic diseases. Recently, NDOs have been obtained by cleaving complex polysaccharides, as this results in high yield and because such oligosaccharides tend to display greater bioactivity. Thelebolus sp. IITKGP BT-12, a recently identified psychrophilic ascomycete fungus, has been reported to produce a bioactive extracellular polysaccharide (EPS). The EPS has been proved to possess strong prebiotic activity and anti-proliferative effects. The current study is an attempt to identify and optimise the most suitable method for hydrolysis of the above-mentioned novel EPS into NDOs, and further to purify and characterise them. Among physical, chemical and enzymatic methods, enzymatic hydrolysis was identified as the best method, and the optimum hydrolysis conditions obtained using response surface methodology were: a reaction time of 24 h, a β-(1,3) endo-glucanase concentration of 0.53 U and a substrate concentration of 10 mg/ml. The NDOs were purified using gel filtration chromatography, and their molecular weights were determined using MALDI-TOF. The major fraction was found to have a DP of 7-8. The monomeric units of the NDOs were confirmed to be glucose using TLC and GCMS-MS analysis. The obtained oligosaccharides proved to be non-digestible when subjected to gastric acidity and salivary and pancreatic amylases, and hence could serve as efficient prebiotics.

Keywords: characterisation, enzymatic hydrolysis, non-digestible oligosaccharides, response surface methodology

Procedia PDF Downloads 127
6630 Optimization of Reliability Test Plans: Increase Wafer Fabrication Equipments Uptime

Authors: Swajeeth Panchangam, Arun Rajendran, Swarnim Gupta, Ahmed Zeouita

Abstract:

Semiconductor processing chambers tend to operate in controlled but aggressive conditions (chemistry, plasma, high temperature, etc.). Owing to this, the design of this equipment requires developing robust and reliable hardware and software. Any equipment downtime due to reliability issues can have cost implications both for customers, in terms of tool downtime (reduced throughput), and for equipment manufacturers, in terms of high warranty costs and a customer trust deficit. A thorough reliability assessment of critical parts and a plan for preventive maintenance/replacement schedules need to be completed before tool shipment. This helps to save significant warranty costs and tool downtime in the field. However, designing a proper reliability test plan that accurately demonstrates reliability targets with the proper sample size and test duration is quite challenging. This is mainly because components can fail in different failure modes that follow Weibull distributions with different beta values. Without an a priori Weibull beta for the failure mode under consideration, the test leads to over- or under-utilization of resources, which eventually ends in false-positive or false-negative estimates. This paper proposes a methodology to design a reliability test plan with optimal sample size, test duration, or both (independent of the a priori Weibull beta). This methodology can be used in demonstration tests and can be extended to accelerated life tests to further decrease sample size and test duration.
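To make the trade-off concrete, the classical zero-failure ("success run") demonstration test plan shows how the assumed Weibull beta drives sample size. This is the textbook formula, not necessarily the paper's proposed method: to demonstrate reliability R at confidence C, with each unit tested for k mission-lengths and an assumed Weibull shape beta, the required zero-failure sample size is n = ln(1 - C) / (k^beta * ln R).

```python
import math

def success_run_sample_size(reliability, confidence, test_ratio=1.0, beta=1.0):
    """Zero-failure sample size: n = ln(1 - C) / (k**beta * ln(R))."""
    n = math.log(1.0 - confidence) / (test_ratio ** beta * math.log(reliability))
    return math.ceil(n)

# Demonstrate R = 0.90 at 90% confidence, each unit tested one mission length.
n_base = success_run_sample_size(0.90, 0.90)
# Extending the test to 2 mission-lengths cuts n sharply when beta > 1
# (wear-out failure modes), which is why the assumed beta matters so much.
n_ext = success_run_sample_size(0.90, 0.90, test_ratio=2.0, beta=2.0)
print(n_base, n_ext)   # 22 units vs 6 units
```

Assuming the wrong beta here is exactly the over/under-utilization the abstract warns about: a beta guessed too high yields an undersized test (false positives), too low an oversized one.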

Keywords: reliability, stochastics, preventive maintenance

Procedia PDF Downloads 12
6629 Microstructure and Mechanical Properties Evaluation of Graphene-Reinforced AlSi10Mg Matrix Composite Produced by Powder Bed Fusion Process

Authors: Jitendar Kumar Tiwari, Ajay Mandal, N. Sathish, A. K. Srivastava

Abstract:

Over the last decade, graphene has attracted great attention in the development of multifunctional metal matrix composites, which are highly demanded by industries developing energy-efficient systems. This study covers two advanced aspects of current scientific endeavor, i.e., graphene as a reinforcement in metallic materials and additive manufacturing (AM) as a processing technology. Herein, high-quality graphene and AlSi10Mg powder were mechanically mixed by very-low-energy ball milling with 0.1 wt.% and 0.2 wt.% graphene. The mixed powder was directly subjected to the powder bed fusion process, an AM technique, to produce composite samples along with a bare counterpart. The effects of graphene on porosity, microstructure, and mechanical properties were examined in this study. The volumetric distribution of pores was observed under X-ray computed tomography (CT). On the basis of relative density measurement by X-ray CT, it was observed that porosity increases after graphene addition, and pore morphology also transformed from spherical pores to enlarged flaky pores due to improper melting of the composite powder. Furthermore, the microstructure suggests grain refinement after graphene addition. The columnar grains were able to cross the melt pool boundaries in the bare sample, unlike the composite samples. Smaller columnar grains formed in the composites due to heterogeneous nucleation by graphene platelets during solidification. The tensile properties were affected by the induced porosity irrespective of graphene reinforcement. The optimized tensile properties were achieved at 0.1 wt.% graphene. The increments in yield strength and ultimate tensile strength were 22% and 10%, respectively, for the 0.1 wt.% graphene-reinforced sample in comparison to the bare counterpart, while elongation decreased by 20% for the same sample. The hardness indentations were taken mostly in the solid region in order to avoid the collapse of the pores.
The hardness of the composite increased progressively with graphene content; an increment of around 30% was achieved after the addition of 0.2 wt.% graphene. Therefore, it can be concluded that powder bed fusion can be adopted as a suitable technique to develop graphene-reinforced AlSi10Mg composites, though some further process modification is required to avoid the porosity induced by the addition of graphene, which can be addressed in future work.

Keywords: graphene, hardness, porosity, powder bed fusion, tensile properties

Procedia PDF Downloads 126
6628 Prediction of Product Size Distribution of a Vertical Stirred Mill Based on Breakage Kinetics

Authors: C. R. Danielle, S. Erik, T. Patrick, M. Hugh

Abstract:

In the last decade, there has been an increase in demand for fine grinding due to the depletion of coarse-grained orebodies and an increase in the processing of finely disseminated minerals and complex orebodies. These ores have provided new challenges in concentrator design because fine and ultra-fine grinding is required to achieve acceptable recovery rates. Therefore, the correct design of a grinding circuit is important for minimizing unit costs and increasing product quality. The use of ball mills for grinding in fine size ranges is inefficient and, therefore, vertical stirred grinding mills are becoming increasingly popular in the mineral processing industry due to their well-known high energy efficiency. This work presents a methodology to predict the product size distribution of a vertical stirred mill using a Bond ball mill. The Population Balance Model (PBM) was used to empirically analyze the performance of a vertical mill and a Bond ball mill. The breakage parameters obtained for both grinding mills are compared to determine the possibility of predicting the product size distribution of a vertical mill based on the results obtained from the Bond ball mill. The biggest advantage of this methodology is that most mineral processing laboratories already have a Bond ball mill to perform the tests suggested in this study. Preliminary results show the possibility of predicting the performance of a laboratory vertical stirred mill using a Bond ball mill.
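The batch-grinding form of the Population Balance Model mentioned above can be sketched in a few lines. For size classes i ordered coarse to fine, dm_i/dt = -S_i m_i + sum over j<i of b_ij S_j m_j, where S is the selection function and b the breakage distribution. The parameter values below are invented placeholders; in the methodology described, S and b would be fitted from the Bond ball mill tests.

```python
import numpy as np

n = 4
S = np.array([0.6, 0.4, 0.2, 0.0])   # selection rates (1/min); finest class unbroken
b = np.zeros((n, n))                  # b[i, j]: fraction of broken class j landing in i
b[1:, 0] = [0.5, 0.3, 0.2]            # columns sum to 1, so mass is conserved
b[2:, 1] = [0.6, 0.4]
b[3, 2] = 1.0

m = np.array([1.0, 0.0, 0.0, 0.0])    # all mass starts in the top size class
dt, t_end = 0.01, 5.0
for _ in range(int(t_end / dt)):      # explicit Euler integration of the PBM
    dm = -S * m + b @ (S * m)
    m = m + dt * dm

print(m, m.sum())                     # product size distribution after 5 min
```

Because each column of b sums to one, total mass is conserved exactly at every step; predicting a vertical mill from Bond-mill data amounts to transferring (or scaling) the fitted S and b between the two devices.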

Keywords: bond ball mill, population balance model, product size distribution, vertical stirred mill

Procedia PDF Downloads 291
6627 Investigating the Indoor Air Quality of the Respiratory Care Wards

Authors: Yu-Wen Lin, Chin-Sheng Tang, Wan-Yi Chen

Abstract:

Various biological specimens, drugs, and chemicals exist in the hospital. Medical staff and hypersensitive inpatients might be exposed to multiple hazards while they work or stay in the hospital. Therefore, the indoor air quality (IAQ) of the hospital deserves more attention. Respiratory care wards (RCW) are responsible for caring for patients who cannot breathe spontaneously without ventilators. The patients in RCW are easily infected, and in other studies RCW showed higher bacteria concentrations than other hospital units. This research monitored the IAQ of the RCW and checked compliance with the indoor air quality standards of the Taiwan Indoor Air Quality Act. Meanwhile, the influential factors of IAQ and the impacts of ventilator modules, with humidifier or with filter, were investigated. The IAQ of two five-bed wards and one nurse station of an RCW in a regional hospital was monitored. Monitoring proceeded for 16 or 24 hours during the sampling days, with a sampling frequency of 20 minutes per hour. Monitoring was performed for two days in a row, and the IAQ of the RCW was measured for eight days in total. The concentrations of carbon dioxide (CO₂), carbon monoxide (CO), particulate matter (PM), nitrogen oxides (NOₓ), and total volatile organic compounds (TVOCs), as well as relative humidity (RH) and temperature, were measured by direct-reading instruments. The bioaerosol samples were taken hourly. The hourly air change rate (ACH) was calculated by measuring the air ventilation volume. Human activities were recorded during the sampling period. The linear mixed model (LMM) was applied to identify the factors influencing IAQ. The concentrations of CO, CO₂, PM, bacteria and fungi exceeded the Taiwan IAQ standards. The major factors affecting the concentrations of CO, PM₁ and PM₂.₅ were location and the number of inpatients.
The significant factors altering the CO₂ and TVOC concentrations were location and the numbers of in-and-out staff and inpatients. The number of in-and-out staff and the level of activity statistically affected the PM₁₀ concentrations. The level of activity and the numbers of in-and-out staff and inpatients were the significant factors in changing the bacteria and fungi concentrations. Different models of the patients' ventilators did not affect the IAQ significantly. The results of the LMM can be utilized to predict pollutant concentrations under various environmental conditions. The results of this study would be a valuable reference for air quality management of RCW.
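The air change rate computation mentioned in the abstract is straightforward: ACH is the ventilation airflow delivered in one hour divided by the room volume. The ward dimensions and airflow below are invented for illustration only.

```python
def air_changes_per_hour(airflow_m3_per_h, room_volume_m3):
    """ACH = hourly supply airflow (m^3/h) divided by room volume (m^3)."""
    return airflow_m3_per_h / room_volume_m3

# Hypothetical five-bed ward: 6 m x 8 m x 3 m, supplied with 290 m^3/h.
volume = 6.0 * 8.0 * 3.0          # 144 m^3
ach = air_changes_per_hour(290.0, volume)
print(round(ach, 2))              # about 2 air changes per hour
```

Low ACH values like this one would be consistent with the pollutant accumulation the study observed; ventilation guidelines for patient rooms typically call for substantially higher rates.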

Keywords: respiratory care ward, indoor air quality, linear mixed model, bioaerosol

Procedia PDF Downloads 106
6626 Effects of Bleaching Procedures on Dentine Sensitivity

Authors: Suhayla Reda Al-Banai

Abstract:

Problem Statement: Tooth whitening has been used for over one hundred and fifty years. The question concerning the whiteness of teeth is a complex one, since tooth whiteness varies from individual to individual, dependent on age, culture, etc. The degree of whitening following treatment may depend on the type of whitening system used. There are a few side effects to the process, including tooth sensitivity and gingival irritation, although some individuals may experience no pain or sensitivity following the procedure. Purpose: To systematically review the available published literature until 31st December 2021, to identify all relevant studies for inclusion and to determine whether there is any evidence demonstrating that the application of whitening procedures results in tooth sensitivity. Aim: To systematically review the available published literature to identify all relevant studies for inclusion and to determine whether there is any evidence that the application of 10% and 15% carbamide peroxide in tooth whitening procedures results in tooth sensitivity. Material and Methods: Following a review of 70 relevant papers from searching electronic databases (OVID MEDLINE and PubMed) and hand searching of relevant journals, 49 studies were identified, 42 papers were subsequently excluded, and 7 studies were finally accepted for inclusion. The extraction of data for inclusion was conducted by two reviewers. The main outcome measures were the methodology and assessment used by investigators to evaluate tooth sensitivity in tooth whitening studies. Results: The reported evaluation of tooth sensitivity during tooth whitening procedures was based on the subjective response of subjects rather than a recognized evaluation methodology. One of the problems in evaluation was the lack of homogeneity in study design. Seven studies were included.
The included studies featured randomized groups, placebo controls, and double-blind and single-blind designs. Drop-out data were obtained from two of the included studies. Three of the included studies reported sensitivity at the baseline visit. Two of the included studies mentioned the exclusion criteria. Conclusions: The results were inconclusive due to the limited number of included studies, the study methodologies, and the evaluation of dentine sensitivity reported. Tooth whitening procedures adversely affect both hard and soft tissues in the oral cavity. Side effects are mild and transient in nature. Whitening solutions with greater than 10% carbamide peroxide cause more tooth sensitivity. Studies using nightguard vital bleaching with 10% carbamide peroxide reported two side effects, tooth sensitivity and gingival irritation, although tooth sensitivity was more prevalent than gingival irritation.

Keywords: dentine, sensitivity, bleaching, carbamide peroxide

Procedia PDF Downloads 69
6625 Possibilities to Evaluate the Climatic and Meteorological Potential for Viticulture in Poland: The Case Study of the Jagiellonian University Vineyard

Authors: Oskar Sekowski

Abstract:

Current global warming is changing the traditional zones of viticulture worldwide. During the 20th century, the average global air temperature increased by 0.89˚C. Climate change models indicate that viticulture, currently concentrated in narrow geographic niches, may move towards the poles, to higher geographic latitudes. There is therefore a need to estimate the climatic conditions and climate change in areas that are not traditionally associated with viticulture, e.g., Poland. The primary objective of this paper is to prepare a methodology for evaluating the climatic and meteorological potential for viticulture in Poland based on a case study. An additional aim is to evaluate the climatic potential of the mesoregion where a university vineyard is located. Daily data on temperature, precipitation, insolation, and wind speed (1988-2018) from the meteorological station located in Łazy, southern Poland, were used to evaluate 15 climatological parameters and indices connected with viticulture. The subsequent steps of the methodology are based on Geographic Information System methods. Topographical factors such as slope gradient and slope exposure were derived from Digital Elevation Models. The spatial distribution of climatological elements was interpolated by ordinary kriging. The values of each factor and index were also ranked and classified. The viticultural potential was determined by integrating two suitability maps, i.e., the topographical and climatic ones, and by calculating the average for each pixel. Data analysis shows significant changes in heat accumulation indices, driven by increases in maximum temperature, mostly an increasing number of days with Tmax > 30˚C. The climatic conditions of this mesoregion are sufficient for Vitis vinifera viticulture.
The values of the indices and insolation are similar to those in established wine regions located at similar geographical latitudes in Europe. The smallest threat to viticulture in the study area is the occurrence of hail, and the greatest is the occurrence of frost in winter. This research provides the basis for evaluating general suitability and climatological potential for viticulture in Poland. To characterize the climatic potential for viticulture, it is necessary to assess the suitability of all climatological and topographical factors that can influence viticulture. The methodology used in this case study identifies places where vineyards could be established. It may also help wine-makers select grape varieties.
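The heat-accumulation quantities mentioned above (a growing-degree-day index and the count of days with Tmax > 30˚C) can be sketched in a few lines. The 10˚C base temperature is the conventional Winkler-style choice, and the sample values are illustrative assumptions, not the Łazy station record:

```python
def growing_degree_days(daily_mean_c, base=10.0):
    """Sum of daily mean temperature excess above a base (Winkler-style heat accumulation)."""
    return sum(max(0.0, t - base) for t in daily_mean_c)

def hot_days(daily_max_c, threshold=30.0):
    """Count of days whose maximum temperature exceeds the threshold."""
    return sum(1 for t in daily_max_c if t > threshold)

# Illustrative three-day sample (assumed data)
print(growing_degree_days([12.0, 9.0, 15.0]))  # 2 + 0 + 5 = 7.0
print(hot_days([31.0, 29.5, 35.0]))            # 2
```

In the paper's workflow, such indices would be computed per station or per interpolated grid cell, then ranked and combined into the climatic suitability map.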

Keywords: climatologic potential, climatic classification, Poland, viticulture

Procedia PDF Downloads 105
6624 Optimization Approach to Integrated Production-Inventory-Routing Problem for Oxygen Supply Chains

Authors: Yena Lee, Vassilis M. Charitopoulos, Karthik Thyagarajan, Ian Morris, Jose M. Pinto, Lazaros G. Papageorgiou

Abstract:

With globalisation, better coordination of production and distribution decisions has become increasingly important for industrial gas companies seeking to remain competitive in the marketplace. In this work, we investigate a problem that integrates production, inventory, and routing decisions in a liquid oxygen supply chain. The oxygen supply chain consists of production facilities, external third-party suppliers, and multiple customers, including hospitals and industrial customers. The product produced by the plants or sourced from competitors, i.e., third-party suppliers, is distributed by a fleet of heterogeneous vehicles to satisfy customer demands. The objective is to minimise the total operating cost, comprising production, third-party, and transportation costs. The key production decisions are the production and inventory levels and the amount of product sourced from third-party suppliers, while the distribution decisions involve customer allocation, delivery timing, delivery amount, and vehicle routing. Optimising the coordinated production, inventory, and routing decisions is a challenging problem, especially for large-scale instances. Thus, we present a two-stage procedure to solve the integrated problem efficiently. First, the problem is formulated as a mixed-integer linear programming (MILP) model with a simplified routing component. The solution of this first-stage MILP model yields the optimal customer allocation, production and inventory levels, and delivery timing and amounts. Then, we fix these decisions and solve the detailed routing problem. In the second stage, we propose a column generation scheme to address the computational complexity of the resulting detailed routing problem. A case study considering a real-life oxygen supply chain in the UK is presented to illustrate the capability of the proposed models and solution method.
Furthermore, the solutions from the proposed approach are compared with those obtained by existing metaheuristic techniques (e.g., guided local search and tabu search algorithms) to evaluate its efficiency.
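The two-stage structure can be illustrated with a deliberately tiny sketch: stage one allocates each customer to its cheapest source using a crude out-and-back transport approximation, and stage two, with the allocation fixed, finds the best vehicle route. All names, costs, and distances below are assumptions for illustration, and the brute-force route search merely stands in for the paper's MILP and column generation scheme:

```python
from itertools import permutations

# Illustrative data (assumed, not from the paper's UK case study)
sources = {"plant": 5.0, "third_party": 8.0}      # unit supply cost
demand = {"C1": 10.0, "C2": 6.0, "C3": 4.0}       # customer demand
dist = {("D", "C1"): 4, ("D", "C2"): 5, ("D", "C3"): 6,
        ("C1", "C2"): 2, ("C1", "C3"): 4, ("C2", "C3"): 3}

def d(a, b):
    """Symmetric distance lookup between depot 'D' and customers."""
    return dist[(a, b)] if (a, b) in dist else dist[(b, a)]

def allocate():
    """Stage 1 (simplified): assign each customer to the cheapest source,
    approximating transport by a direct out-and-back trip from the depot."""
    alloc = {}
    for c, q in demand.items():
        costs = {s: unit * q + 2 * d("D", c) for s, unit in sources.items()}
        alloc[c] = min(costs, key=costs.get)
    return alloc

def best_route(stops):
    """Stage 2: with the allocation fixed, brute-force the cheapest
    single-vehicle route (the paper uses column generation instead)."""
    best = None
    for perm in permutations(stops):
        length = (d("D", perm[0])
                  + sum(d(a, b) for a, b in zip(perm, perm[1:]))
                  + d(perm[-1], "D"))
        if best is None or length < best[0]:
            best = (length, perm)
    return best

print(allocate())                         # every customer picks the cheaper plant
print(best_route(("C1", "C2", "C3")))     # shortest tour and its length
```

The decomposition mirrors the paper's idea: the coarse first-stage decision fixes who serves whom, so the second stage only has to price out routes over a much smaller decision space.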

Keywords: production planning, inventory routing, column generation, mixed-integer linear programming

Procedia PDF Downloads 111
6623 Evaluation of Prestressed Reinforced Concrete Slab Punching Shear Using Finite Element Method

Authors: Zhi Zhang, Liling Cao, Seyedbabak Momenzadeh, Lisa Davey

Abstract:

Reinforced concrete (RC) flat slab-column systems are commonly used in residential and office buildings, as the flat slab provides efficient clearance, allowing more stories at a given height than a regular reinforced concrete beam-slab system. Punching shear at slab-column joints is a critical aspect of two-way reinforced concrete flat slab design. The unbalanced moment at the joint is transferred via slab moment and shear forces. ACI 318 provides an equation to evaluate the punching shear under the design load. It is important to note that the design code considers gravity and environmental loads in the design load combinations, but does not consider the effect of differential foundation settlement, which may be a governing load condition for the slab design. This paper describes how prestressed reinforced concrete slab punching shear is evaluated based on ACI 318 provisions and finite element analysis. A prestressed reinforced concrete slab under differential settlement is studied using a finite element modeling methodology. The punching shear check equation is explained. The methodology for extracting punching shear check data from the finite element model is described and correlated with the corresponding code provisions. The study indicates that the finite element analysis results should be carefully reviewed and processed in order to perform an accurate punching shear evaluation. Conclusions are drawn from the case studies to help engineers understand punching shear behavior in prestressed and non-prestressed reinforced concrete slabs.
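As a hedged sketch of the code-based capacity side of the check described above, the snippet below evaluates the ACI 318 two-way (punching) shear stress limit for a non-prestressed interior slab-column joint; the prestressed form adds terms for the effective precompression and the vertical tendon component, and is not reproduced here. The column size, effective depth, and concrete strength are assumed example values, not the paper's case study:

```python
import math

def concrete_shear_stress(fc_psi, beta, alpha_s, d_in, b0_in, lam=1.0):
    """Smallest of the three ACI 318 limits on the two-way shear stress vc (psi):
    4*lam*sqrt(f'c), (2 + 4/beta)*lam*sqrt(f'c), (alpha_s*d/b0 + 2)*lam*sqrt(f'c)."""
    root = lam * math.sqrt(fc_psi)
    return min(4.0 * root,
               (2.0 + 4.0 / beta) * root,
               (alpha_s * d_in / b0_in + 2.0) * root)

# Assumed example: 16 in square interior column (alpha_s = 40, beta = 1),
# effective depth d = 6.5 in, f'c = 4000 psi, normal-weight concrete (lam = 1)
d, c = 6.5, 16.0
b0 = 4 * (c + d)                 # critical perimeter at d/2 from each column face
vc = concrete_shear_stress(4000.0, beta=1.0, alpha_s=40.0, d_in=d, b0_in=b0)
phi_vc = 0.75 * vc               # design capacity with phi = 0.75
print(round(vc, 1), round(phi_vc, 1))
```

In the paper's workflow, the demand side (the factored shear stress, including the fraction of unbalanced moment transferred by eccentric shear) would come from the finite element model, to be compared against this capacity.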

Keywords: differential settlement, finite element model, prestressed reinforced concrete slab, punching shear

Procedia PDF Downloads 128