Search results for: direct translation approach
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 17139

15459 Removal of Problematic Organic Compounds from Water and Wastewater Using the Arvia™ Process

Authors: Akmez Nabeerasool, Michaelis Massaros, Nigel Brown, David Sanderson, David Parocki, Charlotte Thompson, Mike Lodge, Mikael Khan

Abstract:

The provision of clean and safe drinking water is of paramount importance and is a basic human need. Water scarcity, coupled with the tightening of regulations and the inability of current treatment technologies to deal with emerging contaminants and pharmaceuticals and personal care products, means that viable and cost-effective alternative treatment technologies are required in order to meet demand and regulations for clean water supplies. Logistically, the application of water treatment in rural areas presents unique challenges due to the decentralisation of abstraction points arising from low population density, the resultant lack of infrastructure, and the need to treat water at the site of use. This makes it costly to centralise treatment facilities and hence provide potable water direct to the consumer. Furthermore, across the UK there are segments of the population that rely on a private water supply, which means that the owner or user(s) of these supplies, which can serve from one household to hundreds, are responsible for their maintenance. Since the treatment of these private water supplies falls on the private owners, it is imperative that a chemical-free technological solution is employed that can operate unattended and does not produce any waste. Arvia’s patented advanced oxidation technology combines the advantages of adsorption and electrochemical regeneration within a single unit: the Organics Destruction Cell (ODC). The ODC uniquely uses a combination of adsorption and electrochemical regeneration to destroy organics. Key to this innovative process is an alternative approach to adsorption. The conventional approach is to use high-capacity adsorbents (e.g. activated carbons with high porosities and surface areas) that are excellent adsorbents but require complex and costly regeneration.
Arvia’s technology uses a patent-protected adsorbent, Nyex™, a non-porous, highly conductive, graphite-based adsorbent material that enables it to act both as the adsorbent and as a 3D electrode. Adsorbed organics are oxidised and the surface of the Nyex™ is regenerated in-situ for further adsorption without interruption or replacement. Treated water flows from the bottom of the cell, where it can either be re-used or safely discharged. Arvia™ Technology Ltd. has trialled the application of its tertiary water treatment technology in treating reservoir water abstracted near Glasgow, Scotland, with promising results. Several other pilot plants have also been successfully deployed at various locations in the UK, showing the suitability and effectiveness of the technology in removing recalcitrant organics (including pharmaceuticals, steroids and hormones), COD and colour.

Keywords: Arvia™ process, adsorption, water treatment, electrochemical oxidation

Procedia PDF Downloads 264
15458 Predicting Polyethylene Processing Properties Based on Reaction Conditions via a Coupled Kinetic, Stochastic and Rheological Modelling Approach

Authors: Kristina Pflug, Markus Busch

Abstract:

Being able to predict polymer properties and processing behavior based on the applied operating reaction conditions is one of the key challenges in modern polymer reaction engineering. Especially for cost-intensive processes with high safety requirements, such as the high-pressure polymerization of low-density polyethylene (LDPE), the need for simulation-based process optimization and product design is high. A multi-scale modelling approach was set up and validated via a series of high-pressure mini-plant autoclave reactor experiments. The approach starts with the numerical modelling of the complex reaction network of the LDPE polymerization, taking into consideration the actual reaction conditions. While this gives average product properties, the complex polymeric microstructure, including random short- and long-chain branching, is calculated via a hybrid Monte Carlo approach. Finally, the processing behavior of LDPE (its melt flow behavior) is determined in dependence on the previously determined polymeric microstructure, using the branch-on-branch algorithm for randomly branched polymer systems. All three steps of the multi-scale modelling approach can be independently validated against analytical data. A triple-detector GPC containing an IR, a viscometry and a multi-angle light scattering detector is applied. It serves to determine molecular weight distributions as well as chain-length-dependent short- and long-chain branching frequencies. 13C-NMR measurements give average branching frequencies, and rheological measurements in shear and extension serve to characterize the polymeric flow behavior. The accordance of experimental and modelled results was found to be excellent, especially considering that the applied multi-scale modelling approach does not involve fitting parameters to the data. This validates the suggested approach and proves its universality at the same time.
In the next step, the modelling approach can be applied to other reactor types, such as tubular reactors, and to industrial scale. Moreover, sensitivity analyses for systematically varied process conditions are easily feasible. The developed multi-scale modelling approach ultimately makes it possible to predict and design LDPE processing behavior simply from process conditions such as feed streams and inlet temperatures and pressures.
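The hybrid Monte Carlo step can be illustrated with a toy sketch: chains grow monomer by monomer, each propagation step may create a short-chain branch, and branching frequency is recovered from the sampled population. The probabilities below are invented for illustration, not kinetic constants from the paper.

```python
import random

def simulate_chains(n_chains=500, p_term=0.001, p_branch=0.002, seed=0):
    """Grow chains monomer by monomer; each propagation step creates a branch
    point with probability p_branch and terminates with probability p_term.
    Returns branches per 1000 monomer units (illustrative probabilities only)."""
    rng = random.Random(seed)
    monomers = branches = 0
    for _ in range(n_chains):
        while True:
            monomers += 1
            if rng.random() < p_branch:
                branches += 1
            if rng.random() < p_term:
                break
    return 1000.0 * branches / monomers

print(round(simulate_chains(), 2))  # close to p_branch * 1000 = 2.0
```

In the real model, branch placement would depend on the instantaneous radical concentrations from the kinetic step; here the per-step probability is simply held constant.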

Keywords: low-density polyethylene, multi-scale modelling, polymer properties, reaction engineering, rheology

Procedia PDF Downloads 125
15457 Teachers’ Reactions, Learning, Organizational Support, and Use of Lesson Study for Transformative Assessment

Authors: Melaku Takele Abate, Abbi Lemma Wodajo, Adula Bekele Hunde

Abstract:

This study aimed at exploring mathematics teachers' reactions, learning, school leaders’ support, and use of the Lesson Study for Transformative Assessment (LSforTA) program ideas in practice. Because the LSforTA program was new, a local and grounded approach was needed to examine the knowledge and skills teachers acquired using LSforTA, so a design-based research approach was selected to evaluate and refine it. The results showed that LSforTA increased teachers' knowledge and use of different levels of mathematics assessment tasks. The program positively affected teachers' practices of transformative assessment and enhanced their knowledge and skills in assessing students in a transformative way. The paper concludes by describing how the LSforTA procedures were adapted in response to this evaluation and provides suggestions for future development and research.

Keywords: classroom assessment, feedback practices, lesson study, mathematics, design-based research

Procedia PDF Downloads 55
15456 Towards an Indigenous Language Policy for National Integration

Authors: Odoh Dickson Akpegi

Abstract:

The paper is about the need for an indigenous language in order to meaningfully harness both our human and material resources for the nation’s integration. It then examines the knotty issue of the national language question and advocates a piecemeal approach to solving the problem. This approach allows for the development and use of local languages in minority areas, especially in Benue State, as a way of preparing them for consideration as possible replacements for English as Nigeria’s national or official language. Finally, an arrangement for preparing the languages for such competition at the national level is presented.

Keywords: indigenous language, English language, official language, national integration

Procedia PDF Downloads 561
15455 CO2 Emission and Cost Optimization of Reinforced Concrete Frame Designed by Performance Based Design Approach

Authors: Jin Woo Hwang, Byung Kwan Oh, Yousok Kim, Hyo Seon Park

Abstract:

As the greenhouse effect has come to be recognized as a serious environmental problem, interest in carbon dioxide (CO2) emissions, which comprise the major part of greenhouse gas (GHG) emissions, has increased recently. Since the construction industry accounts for a relatively large portion of total worldwide CO2 emissions, extensive studies on reducing CO2 emissions in the construction and operation of buildings have been carried out since the 2000s. In parallel, performance-based design (PBD) methodology based on nonlinear analysis has been developed vigorously since the 1994 Northridge Earthquake to assess and assure the seismic performance of buildings more exactly, because structural engineers recognized that the prescriptive code-based design approach cannot address inelastic earthquake responses directly. Although CO2 emissions and the PBD approach are both rising issues in the construction industry and structural engineering, few or no studies have considered these two issues simultaneously. Thus, the objective of this study is to minimize the CO2 emissions and cost of a building designed by the PBD approach at the structural design stage, considering the structural materials. A 4-story, 4-span reinforced concrete building was optimally designed, using the non-dominated sorting genetic algorithm-II (NSGA-II), to minimize CO2 emissions and cost while satisfying a specific seismic performance objective (collapse prevention under the maximum considered earthquake) as well as prescriptive code regulations. The optimized design showed that minimized CO2 emissions and cost were achieved while satisfying the specified seismic performance. Therefore, the methodology proposed in this paper can be used to reduce both the CO2 emissions and the cost of buildings designed by the PBD approach.
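The core of NSGA-II is non-dominated sorting of candidate designs into Pareto fronts. A minimal sketch of that step, using hypothetical (CO2, cost) pairs with both objectives minimized:

```python
def dominates(a, b):
    """a dominates b if a is no worse in every objective and better in at least one (minimization)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def non_dominated_sort(points):
    """Return a list of fronts (lists of indices); front 0 is the Pareto front."""
    fronts = []
    remaining = set(range(len(points)))
    while remaining:
        front = [i for i in remaining
                 if not any(dominates(points[j], points[i]) for j in remaining if j != i)]
        fronts.append(sorted(front))
        remaining -= set(front)
    return fronts

# Hypothetical designs as (CO2 emissions [t], cost [k$]) pairs
designs = [(120, 900), (100, 950), (110, 920), (130, 880), (140, 940)]
print(non_dominated_sort(designs))  # → [[0, 1, 2, 3], [4]]
```

Design (140, 940) is beaten on both objectives by several others, so it falls into the second front; the remaining four are mutual trade-offs and form the Pareto front the paper's optimization reports.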

Keywords: CO2 emissions, performance based design, optimization, sustainable design

Procedia PDF Downloads 407
15454 Sustainable Approach in Textile and Apparel Industry: Case Study Applied to a Medium Enterprise

Authors: Maged Kamal

Abstract:

Previous research papers have suggested that enhancing environmental performance in the textiles and apparel industry positively affects overall enterprise competitiveness. However, there is a gap in the literature regarding simplifying the available theory so that it can be implemented in practice with more confidence in the expected results, especially for small and medium enterprises. The aim of this paper is to simplify and make the best use of the relevant international norms to produce a systematic approach that could serve as a guideline for the practical application of the main sustainability principles in a medium-size textile business. The increase in efficiency that resulted from implementing the suggested approach/model originated from reduced raw material usage, energy and water savings, and risk reduction for people and the environment. The practical case study was implemented in a textile factory producing knitted fabrics, ready-made garments, and dyed and printed fabrics. The results were analyzed to examine the effect of the suggested change on the enterprise’s profitability.

Keywords: apparel industry, environmental management, sustainability, textiles

Procedia PDF Downloads 295
15453 The Role of Psychosis Proneness in the Association of Metacognition with Psychological Distress in Non-Clinical Population

Authors: Usha Barahmand, Ruhollah Heydari Sheikh Ahmad

Abstract:

Distress refers to an unpleasant mental state or emotional suffering marked by negative affect such as depression (e.g., lost interest; sadness; hopelessness) and anxiety (e.g., restlessness; feeling tense). Such negative affect has frequently been suggested to accompany mental disorders such as positive psychosis symptoms, and also proneness to psychotic features in non-clinical populations. Psychosis proneness, including hallucination, delusion and schizotypal traits, has been found to be associated with metacognitive beliefs. Metacognition has been conceptualized as ‘thinking about thoughts, monitoring and controlling of cognitive processes’. The aim of the current study was to investigate the role of psychosis proneness in the association of metacognitions and distress. We predicted psychosis proneness would mediate the association between metacognitive beliefs and distress. A sample of 420 university students was randomly recruited to complete the study questionnaires, which consisted of the DASS-21 for assessing levels of distress, the Cartwright-Hatton and Wells Metacognitions Questionnaire (MCQ-30) for assessing metacognitive beliefs, the Launay-Slade Hallucination Scale-Revised (LSHS-R), the Peters et al. Delusions Inventory, and the Schizotypal Personality Questionnaire-Brief. Using a bootstrapping approach to test our hypothesis, the results showed that there was no direct association between metacognitive dimensions and psychological distress, and that psychosis proneness significantly mediated the association. The findings suggest that individuals with dysfunctional metacognitive beliefs experience high levels of distress if they are prone to psychosis symptoms. In other words, psychosis proneness is a path through which individuals with dysfunctional metacognitions experience high levels of psychological distress.
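The bootstrapped test of an indirect (mediated) effect can be sketched as follows, on synthetic data rather than the study's sample; the regression and percentile-bootstrap machinery is generic, not the authors' exact analysis.

```python
import random

def ols(y, X):
    """Ordinary least squares via normal equations; X rows include an intercept column."""
    n, k = len(y), len(X[0])
    xtx = [[sum(X[i][p] * X[i][q] for i in range(n)) for q in range(k)] for p in range(k)]
    xty = [sum(X[i][p] * y[i] for i in range(n)) for p in range(k)]
    for col in range(k):                      # Gaussian elimination with partial pivoting
        piv = max(range(col, k), key=lambda r: abs(xtx[r][col]))
        xtx[col], xtx[piv] = xtx[piv], xtx[col]
        xty[col], xty[piv] = xty[piv], xty[col]
        for r in range(col + 1, k):
            f = xtx[r][col] / xtx[col][col]
            for c in range(col, k):
                xtx[r][c] -= f * xtx[col][c]
            xty[r] -= f * xty[col]
    beta = [0.0] * k
    for r in range(k - 1, -1, -1):            # back substitution
        beta[r] = (xty[r] - sum(xtx[r][c] * beta[c] for c in range(r + 1, k))) / xtx[r][r]
    return beta

def indirect_effect(x, m, y):
    a = ols(m, [[1.0, xi] for xi in x])[1]                  # path a: M ~ X
    b = ols(y, [[1.0, mi, xi] for mi, xi in zip(m, x)])[1]  # path b: Y ~ M + X
    return a * b

def bootstrap_ci(x, m, y, n_boot=1000, seed=0):
    """Percentile bootstrap 95% CI for the indirect effect a*b."""
    rng, n, effs = random.Random(seed), len(x), []
    for _ in range(n_boot):
        idx = [rng.randrange(n) for _ in range(n)]
        effs.append(indirect_effect([x[i] for i in idx],
                                    [m[i] for i in idx],
                                    [y[i] for i in idx]))
    effs.sort()
    return effs[int(0.025 * n_boot)], effs[int(0.975 * n_boot)]

# Synthetic data with a true indirect effect of 0.5 * 0.6 = 0.3
gen = random.Random(1)
x = [gen.gauss(0, 1) for _ in range(200)]
m = [0.5 * xi + gen.gauss(0, 1) for xi in x]
y = [0.6 * m[i] + 0.2 * x[i] + gen.gauss(0, 1) for i in range(200)]
lo, hi = bootstrap_ci(x, m, y, n_boot=500)
print(lo > 0)  # a CI excluding zero indicates mediation
```

A CI for the indirect effect that excludes zero is the usual bootstrap evidence of mediation, which is the logic the abstract invokes for psychosis proneness.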

Keywords: metacognition, non-clinical population, psychological distress, psychosis proneness

Procedia PDF Downloads 341
15452 Function of Quranic Family Lifestyle in the Development of Modern Islamic Civilization

Authors: Zeinabossadat Hosseini, Fateme Qorbani

Abstract:

The universal community has suffered from the lack of a sustainable and prosperous civilization, and humanity’s distance from religious doctrines exposes the civilization of society to decline and collapse. To build a sustainable Islamic civilization, it is essential to understand and strengthen its core foundations. Islam, which claims to be integral in ensuring human prosperity and the creation of a new Islamic civilization, can only develop this civilization if the necessary foundations are in place. On the other hand, the family is one of the most important and effective foundations of individual and community life, and given the influential role of the family in human behavioral and cognitive domains, it is impossible to define and explain the development of Islamic civilization without regard to the family. The family can contribute to this important matter through its direct and indirect role in the education of individuals, as its members have the highest degree of interaction with and impact on each other. Developing the teachings of Islam in the form of verses and traditions can guide people towards the Islamic lifestyle and thus towards Islamic civilization and the Pure Life (Tayyibah life; Al-Nahl/97). This article uses a descriptive-analytical approach to conclude that modern Islamic civilization promises the prosperity of this world and the hereafter. It will bring peace and prosperity to the world, as well as advancement, the fight against poverty, unity and solidarity of Muslims, preservation of human dignity, and the growth of spirituality. It can also be deduced that the foundations of Islamic civilization in the Qur'anic Tayyibah life, in today's terms the Islamic lifestyle, can be identified and implemented in the family structure, and that the components of this blissful life can be found in this focus. The Tayyibah life will be realized by relying on right faith and practice, paying attention to the rulings and divine commands mentioned in the verses as well as in the traditions, altruism, and nurturing a commitment to the community.

Keywords: family, development of modern Islamic civilization, quranic lifestyle, Tayyibah life

Procedia PDF Downloads 151
15451 Progressive Participatory Observation Applied to Priority Neighbourhoods

Authors: Serge Rohmer

Abstract:

This paper proposes a progressive participatory observation method that can be used as a sociological investigation tool within communities. The usefulness of participant observation in sociological projects is first asserted, particularly in an urban context. Competencies, know-how and interpersonal skills are then explained before the progressive approach, consisting of four levels of observation, is detailed. The progressive participatory observation is applied to an experimental project to set up a permaculture urban micro-farm with residents of a priority neighbourhood. Feedback on the experiment has identified several key recommendations for implementing the approach.

Keywords: participatory observation, observation scale, priority neighbourhood, urban sociology

Procedia PDF Downloads 28
15450 Dynamic Cardiac Mitochondrial Proteome Alterations after Ischemic Preconditioning

Authors: Abdelbary Prince, Said Moussa, Hyungkyu Kim, Eman Gouda, Jin Han

Abstract:

We compared the dynamic alterations of the mitochondrial proteomes of control, ischemia-reperfusion (IR) and ischemic preconditioned (IPC) rabbit hearts. Using 2-DE, we identified 29 mitochondrial proteins that were differentially expressed in the IR heart compared with the control and IPC hearts. For two of the spots, the expression patterns were confirmed by Western blotting analysis. These proteins included succinate dehydrogenase complex, acyl-CoA dehydrogenase, carnitine acetyltransferase, dihydrolipoamide dehydrogenase, ATPase, ATP synthase, dihydrolipoamide succinyltransferase, ubiquinol-cytochrome c reductase, translation elongation factor, actin alpha, succinyl-CoA ligase, dihydrolipoamide S-succinyltransferase, citrate synthase, acetyl-coenzyme A dehydrogenase, creatine kinase, isocitrate dehydrogenase, pyruvate dehydrogenase, prohibitin, NADH dehydrogenase (ubiquinone) Fe-S protein, enoyl-coenzyme A hydratase, superoxide dismutase [Mn], and the 24-kDa subunit of complex I. Interestingly, most of these proteins are associated with the mitochondrial respiratory chain, the antioxidant enzyme system, and energy metabolism. The results provide clues as to the cardioprotective mechanism of ischemic preconditioning at the protein level and may serve as potential biomarkers for the detection of ischemia-induced cardiac injury.

Keywords: ischemic preconditioning, mitochondria, proteome, cardioprotection

Procedia PDF Downloads 350
15449 Implementation of a Lattice Boltzmann Method for Pulsatile Flow with Moment Based Boundary Condition

Authors: Zainab A. Bu Sinnah, David I. Graham

Abstract:

The Lattice Boltzmann Method has been developed and used to simulate both steady and unsteady fluid flow problems such as turbulent flows, multiphase flow and flows in the vascular system. As an example, the study of blood flow and its properties can give a greater understanding of atherosclerosis and the flow parameters which influence this phenomenon. The blood flow in the vascular system is driven by a pulsating pressure gradient which is produced by the heart. As a very simple model of this, we simulate plane channel flow under periodic forcing. This pulsatile flow is essentially the standard Poiseuille flow except that the flow is driven by the periodic forcing term. Moment boundary conditions, where various moments of the particle distribution function are specified, are applied at solid walls. We used a second-order single relaxation time model and investigated grid convergence using two distinct approaches. In the first approach, we fixed both Reynolds and Womersley numbers and varied relaxation time with grid size. In the second approach, we fixed the Womersley number and relaxation time. The expected second-order convergence was obtained for the second approach. For the first approach, however, the numerical method converged, but not necessarily to the appropriate analytical result. An explanation is given for these observations.
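The grid-convergence study described above reduces to estimating the observed order of accuracy from errors on successively refined grids, p = log(e_coarse / e_fine) / log(ratio). A minimal sketch with hypothetical error values:

```python
import math

def observed_order(errors, refinement_ratio=2.0):
    """Observed order of accuracy from errors on successively refined grids,
    each grid 'refinement_ratio' times finer than the previous one."""
    return [math.log(e1 / e2) / math.log(refinement_ratio)
            for e1, e2 in zip(errors, errors[1:])]

# Hypothetical L2 velocity errors, halving the lattice spacing each time;
# a second-order scheme should give orders approaching 2.
errors = [1.6e-2, 4.1e-3, 1.03e-3, 2.6e-4]
print([round(p, 2) for p in observed_order(errors)])  # → [1.96, 1.99, 1.99]
```

In the second approach above (Womersley number and relaxation time fixed), the orders would settle near 2 as in this toy sequence; in the first approach the sequence converges but the limit need not match the analytical result.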

Keywords: Lattice Boltzmann method, single relaxation time, pulsatile flow, moment based boundary condition

Procedia PDF Downloads 231
15448 Exploring the Effect of Using Lesh Model in Enhancing Prospective Mathematics Teachers’ Number Sense

Authors: Areej Isam Barham

Abstract:

Developing students’ number sense is an essential element in the learning of mathematics. Number sense is one of the foundational ideas in mathematics: students need to understand numbers, represent them in different ways, and realize the relationships among numbers. Number sense also reflects students’ understanding of the meaning of operations, how they relate to one another, how to compute fluently, and how to make reasonable estimates. Developing students’ number sense in the mathematics classroom requires good preparation of mathematics teachers, who will direct their students towards a real understanding of numbers and their use in the learning of mathematics. This study describes the development of elementary prospective mathematics teachers’ number sense through a mathematics teaching methods course at Qatar University and examines the effect of using the Lesh model in enhancing it. Thirty-nine elementary prospective mathematics teachers were involved in the study. The study followed an experimental research approach, and quantitative research methods were used to answer the research questions. A pre-post number sense test was constructed and administered before and after teaching with the Lesh model. Data were analyzed using the Statistical Package for the Social Sciences (SPSS); descriptive statistics and t-tests were used to examine the impact of using the Lesh model on prospective teachers’ number sense. The findings indicated poor number sense and limited numeracy skills before implementing the Lesh model, which clearly demonstrates the importance of the study. The results also revealed a statistically significant positive impact of the Lesh model on prospective teachers’ number sense. The discussion addresses different features and issues related to the participants’ number sense.
In light of the study, recommendations and suggestions for the future development of prospective mathematics teachers’ number sense are presented.
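The pre-post comparison reduces to a paired-samples t statistic on the score gains. A sketch with hypothetical scores, not the study's data:

```python
import math

def paired_t(pre, post):
    """Paired-samples t statistic for post - pre differences."""
    d = [b - a for a, b in zip(pre, post)]
    n = len(d)
    mean = sum(d) / n
    var = sum((x - mean) ** 2 for x in d) / (n - 1)  # sample variance of differences
    return mean / math.sqrt(var / n)

# Hypothetical pre/post number sense scores for 8 prospective teachers
pre  = [42, 55, 38, 61, 47, 50, 44, 58]
post = [55, 60, 52, 70, 58, 57, 50, 66]
print(round(paired_t(pre, post), 2))  # → 7.89
```

A t value this large on 7 degrees of freedom would be highly significant, which is the shape of the result the abstract reports for its 39 participants.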

Keywords: number sense, Lesh model, prospective mathematics teachers, development of number sense

Procedia PDF Downloads 141
15447 A Socio-Technical Approach to Cyber-Risk Assessment

Authors: Kitty Kioskli, Nineta Polemi

Abstract:

Evaluating the level of cyber-security risk within an enterprise is of utmost importance in protecting its information systems, services and all its digital assets against security incidents (e.g. accidents, malicious acts, massive cyber-attacks). Existing risk assessment methodologies (e.g. EBIOS, OCTAVE, CRAMM, NIST-800) adopt a technical approach, considering as attack factors only the capability, intention and target of the attacker, and paying no attention to the attacker’s psychological profile and personality traits. In this paper, a socio-technical approach to cyber risk assessment is proposed in order to achieve more realistic risk estimates by considering the personality traits of attackers. In particular, based upon principles from investigative psychology and behavioural science, a multi-dimensional, extended, quantifiable model of an attacker’s profile is developed, which becomes an additional factor in the cyber risk level calculation.
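One way to make such a profile an "additional factor" is to scale a classic technical risk score by a weighted sum of normalized trait scores. The traits, weights and formula below are illustrative assumptions for the sketch, not the model from the paper or from ISO 27005:

```python
def risk_level(capability, intention, target_value, profile_traits, weights=None):
    """Illustrative extended risk score: classic technical factors (each in [0, 1])
    scaled by a psychological-profile multiplier. Trait names and weights are
    invented for this sketch."""
    weights = weights or {"impulsivity": 0.4, "risk_tolerance": 0.35, "persistence": 0.25}
    profile = sum(weights[t] * v for t, v in profile_traits.items())  # each v in [0, 1]
    technical = capability * intention * target_value
    return technical * (1.0 + profile)  # profile can raise the score up to 2x

attacker = {"impulsivity": 0.8, "risk_tolerance": 0.9, "persistence": 0.5}
print(round(risk_level(0.7, 0.9, 0.8, attacker), 3))  # → 0.887
```

The point of the sketch is only structural: the same technical exposure yields a higher risk estimate for an attacker whose trait profile suggests greater willingness to act.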

Keywords: attacker, behavioural models, cyber risk assessment, cybersecurity, human factors, investigative psychology, ISO27001, ISO27005

Procedia PDF Downloads 167
15446 A Non-parametric Clustering Approach for Multivariate Geostatistical Data

Authors: Francky Fouedjio

Abstract:

Multivariate geostatistical data have become omnipresent in the geosciences and pose substantial analysis challenges. One of them is the grouping of data locations into spatially contiguous clusters, so that data locations within the same cluster are more similar to each other while clusters differ from one another, in some sense. Spatially contiguous clusters can significantly improve interpretation by turning the resulting clusters into meaningful geographical subregions. In this paper, we develop an agglomerative hierarchical clustering approach that takes into account the spatial dependency between observations. It relies on a dissimilarity matrix built from a non-parametric kernel estimator of the spatial dependence structure of the data. It integrates existing methods to find the optimal cluster number and to evaluate the contribution of variables to the clustering. The capability of the proposed approach to provide spatially compact, connected and meaningful clusters is assessed using a bivariate synthetic dataset and a multivariate geochemical dataset. The proposed clustering method gives satisfactory results compared with other similar geostatistical clustering methods.
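The agglomerative step can be sketched with plain average-linkage clustering on a precomputed dissimilarity matrix. In the paper that matrix would come from the non-parametric kernel estimator of spatial dependence; here it is just a toy symmetric matrix:

```python
def average_linkage(d):
    """Agglomerative clustering on a precomputed dissimilarity matrix d.
    Returns the merge history as (members_a, members_b, height) tuples."""
    clusters = {i: [i] for i in range(len(d))}
    merges = []
    while len(clusters) > 1:
        best = None
        for a in clusters:
            for b in clusters:
                if a < b:
                    # average dissimilarity between all cross-cluster pairs
                    dist = sum(d[i][j] for i in clusters[a] for j in clusters[b]) / (
                        len(clusters[a]) * len(clusters[b]))
                    if best is None or dist < best[2]:
                        best = (a, b, dist)
        a, b, h = best
        merges.append((tuple(clusters[a]), tuple(clusters[b]), round(h, 3)))
        clusters[a] = clusters[a] + clusters[b]
        del clusters[b]
    return merges

# Toy dissimilarity matrix for 4 locations: {0,1} and {2,3} are two tight groups
d = [[0.0, 0.2, 0.9, 0.8],
     [0.2, 0.0, 0.7, 0.9],
     [0.9, 0.7, 0.0, 0.1],
     [0.8, 0.9, 0.1, 0.0]]
print(average_linkage(d))  # → [((2,), (3,), 0.1), ((0,), (1,), 0.2), ((0, 1), (2, 3), 0.825)]
```

Cutting the resulting dendrogram at a chosen height yields the clusters; the spatial contiguity in the paper's method comes entirely from how the dissimilarities are built, not from the linkage itself.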

Keywords: clustering, geostatistics, multivariate data, non-parametric

Procedia PDF Downloads 477
15445 The Significance of a Well-Defined Systematic Approach in Risk Management for Construction Projects within Oil Industry

Authors: Batool Ismaeel, Umair Farooq, Saad Mushtaq

Abstract:

Construction projects in the oil industry can be very complex, with unknown outcomes and uncertainties that cannot be easily predicted. Each project has its unique risks generated by a number of factors which, if not controlled, will impact the successful completion of the project, mainly in terms of schedule, cost, quality, and safety. This paper highlights the historic risks associated with projects in the south and east region of Kuwait Oil Company (KOC), collated from the company’s lessons-learned database. From contract award through to handover of the project to the asset owner, the gaps in project execution in terms of managing risk are brought to discussion, showing where a well-defined systematic approach to project risk management should be adopted to address the many claims, changes of scope, budget overruns, and delays in the engineering phase as well as in the procurement and fabrication of long-lead items. This study focuses on a proposed feasible approach to risk management for engineering, procurement and construction (EPC) level projects, covering the various stakeholders involved in executing the works, from international to local contractors and vendors in KOC. The proposed approach covers areas categorized into organizational, design, procurement, construction, pre-commissioning, commissioning and project management, in which risks are identified and require management and mitigation. With the effective deployment and implementation of the proposed risk management system, and its consideration as a vital key to achieving the project’s targets, outcomes will become more predictable, risk triggers will be managed and controlled, and the correct resources can be allocated on a timely basis, helping the company avoid unpredictable outcomes during project execution.
It is recommended in this paper to apply this risk management approach as an integral part of project management and, in future work, to investigate the effectiveness of the proposed system for newly awarded projects, comparing it with projects of similar budget/complexity that have not applied this approach to risk management.

Keywords: construction, project completion, risk management, uncertainties

Procedia PDF Downloads 155
15444 Supporting Women's Economic Development in Rural Papua New Guinea

Authors: Katja Mikhailovich, Barbara Pamphilon

Abstract:

Farmer training in Papua New Guinea has focused mainly on technology-transfer approaches. These have primarily benefited men and often excluded women, whose low literacy and education levels and role in subsistence crops have precluded participation in formal training. The paper discusses an approach that uses both a brokerage model of agricultural extension, to link smallholders with private sector agencies, and an innovative family teams approach that aims to support the economic empowerment of women in families and encourages sustainable and gender-equitable farming and business practices.

Keywords: women, economic development, agriculture, training

Procedia PDF Downloads 394
15443 Multiscale Modeling of Damage in Textile Composites

Authors: Jaan-Willem Simon, Bertram Stier, Brett Bednarcyk, Evan Pineda, Stefanie Reese

Abstract:

Textile composites, in which the reinforcing fibers are woven or braided, have become very popular in numerous applications in aerospace, automotive, and maritime industry. These textile composites are advantageous due to their ease of manufacture, damage tolerance, and relatively low cost. However, physics-based modeling of the mechanical behavior of textile composites is challenging. Compared to their unidirectional counterparts, textile composites introduce additional geometric complexities, which cause significant local stress and strain concentrations. Since these internal concentrations are primary drivers of nonlinearity, damage, and failure within textile composites, they must be taken into account in order for the models to be predictive. The macro-scale approach to modeling textile-reinforced composites treats the whole composite as an effective, homogenized material. This approach is very computationally efficient, but it cannot be considered predictive beyond the elastic regime because the complex microstructural geometry is not considered. Further, this approach can, at best, offer a phenomenological treatment of nonlinear deformation and failure. In contrast, the mesoscale approach to modeling textile composites explicitly considers the internal geometry of the reinforcing tows, and thus, their interaction, and the effects of their curved paths can be modeled. The tows are treated as effective (homogenized) materials, requiring the use of anisotropic material models to capture their behavior. Finally, the micro-scale approach goes one level lower, modeling the individual filaments that constitute the tows. This paper will compare meso- and micro-scale approaches to modeling the deformation, damage, and failure of textile-reinforced polymer matrix composites. 
For the mesoscale approach, the woven composite architecture will be modeled using the finite element method, and an anisotropic damage model for the tows will be employed to capture the local nonlinear behavior. For the micro-scale, two different models will be used: one based on the finite element method and the other on an embedded semi-analytical approach. The goal is the comparison and evaluation of these approaches to modeling textile-reinforced composites in terms of accuracy, efficiency, and utility.

Keywords: multiscale modeling, continuum damage model, damage interaction, textile composites

Procedia PDF Downloads 354
15442 Information and Cooperativity in Fiction: The Pragmatics of David Baboulene’s “Knowledge Gaps”

Authors: Cara DiGirolamo

Abstract:

In his 2017 Ph.D. thesis, script doctor David Baboulene presented a theory of fiction in which differences in knowledge states between participants in a literary experience, including reader, author, and characters, create many story elements, among them suspense, expectations, subtext, theme, metaphor, and allegory. This theory can be adjusted and modeled by incorporating a formal pragmatic approach that understands narrative as a speech act with a conversational function. This approach requires both the Speaker and the Listener to be understood as participants in the discourse. It also uses theories of cooperativity and the Question Under Discussion (QUD) to identify the existence of implicit questions. The approach predicts what an effective literary narrative must do: provide a conversational context early in the story so the reader can engage with the text as a conversational participant. In addition, this model incorporates schema theory, a cognitive model for how information about the world is learned, processed, and transformed into functional knowledge. Using this approach can extend the QUD model: instead of describing conversation as a form of information gathering restricted to question-answer sets, the QUD can include knowledge modeling and understanding as possible outcomes of a conversation. With this model, Baboulene’s “Knowledge Gaps” can provide real insight into storytelling as a conversational move, and extend the QUD to apply simply and effectively to a more diverse set of conversational interactions as well as to narrative texts.

Keywords: literature, speech acts, QUD, literary theory

Procedia PDF Downloads 19
15441 Whole Body Vibration and Low Back Disorder among Saskatchewan Farmers: A Prospective Cohort Study

Authors: Samuel Kwaku Essien, Catherine Trask, Niels Koehncke, Brenna Bath

Abstract:

Background: Low back disorder (LBD) is the most common musculoskeletal problem among farmers, with higher prevalence than in other occupations. Operators of tractors and other farm machinery such as combines or all-terrain vehicles (ATVs) can have considerable cumulative exposure to whole body vibration (WBV). Although there appears to be an association between LBD and WBV, the lack of prospective studies makes the relationship unclear. Purpose: This study investigates the association between WBV and LBD among Saskatchewan farmers using a prospective cohort design. Methods: Data from Phases I (2007) and II (2013) of the Saskatchewan Farm Injury Cohort Study were used. Baseline data were collected via postal questionnaire on accumulated yearly tractor, combine, and ATV use, as well as several covariates supporting a biopsychosocial model of LBD. Follow-up data on musculoskeletal symptoms were collected at the 6-year follow-up, with a sample size of 1,149. Participants answered ‘yes’ or ‘no’ to questions on ‘low back trouble’ (ache, pain, discomfort) experienced in the last 12 months. A GEE-modified Poisson approach was performed using SPSS 22 and SAS 9.4. Results: Twelve-month prevalence of LBD was 59.8%. In multivariate analysis of the 6-year follow-up, LBD was associated with ATV operation and tractor operation, with a dose-response relationship for annual accumulated tractor operation. Although combine operation ≥ 61 hrs/year was related to LBD in bivariate analysis, this difference did not persist after adjustment for confounders. Age was found to be a confounder in the relationship between WBV and LBD, and no interactions were found. Conclusion: Longer annual tractor operation and older age are important predictors of LBD symptoms in farmers. Future research involving direct measurement can help identify appropriate prevention strategies.

Keywords: agriculture, low back disorder, low back pain, occupational health

Procedia PDF Downloads 327
15440 Big Classes, Bigger Ambitions: A Participatory Approach to the Multiple-Choice Exam

Authors: Melanie Adrian, Elspeth McCulloch, Emily-Jean Gallant

Abstract:

Resources (financial, physical, and human) are increasingly constrained in higher education. University classes are getting bigger, and the concomitant grading burden on faculty is growing rapidly. Multiple-choice exams are seen by some as one solution to these changes. How much students retain, however, and what their testing experience is, continues to be debated. Are multiple-choice exams serving students well, or are students bearing the burden of these developments? Is there a way to address the resource constraints and make these types of exams more meaningful? In short, how do we engender evaluation methods for large-scale classes that provide opportunities for heightened student learning and enrichment? The following article lays out a testing approach we have employed in four iterations of the same third-year law class. We base our comments in this paper on our initial observations as well as data gathered from an ethics-approved study looking at student experiences. This testing approach provides students with multiple opportunities for revision (thus increasing chances for long-term retention), is both individually and collaboratively driven (thus reflecting individual and group effort), and is automatically graded (thus conserving limited institutional resources). We found that overall students appreciated the approach and found it more ‘humane’: it notably reduced pre-exam and intra-exam stress levels, increased ease, and lowered nervousness.

Keywords: exam, higher education, multiple-choice, law

Procedia PDF Downloads 128
15439 25 Years of the Neurolinguistic Approach: Origin, Outcomes, Expansion and Current Experiments

Authors: Steeve Mercier, Joan Netten, Olivier Massé

Abstract:

The traditional lack of success of most Canadian students in the regular French program in attaining the ability to communicate spontaneously led to the conceptualization of a modified program. This program, called Intensive French, introduced and evaluated as an experiment in several school districts, formed the basis for the creation of a more effective approach for the development of skills in a second/foreign language and literacy: the Neurolinguistic Approach (NLA). The NLA expresses a major change in the understanding of how communication skills are developed: learning to communicate spontaneously in a second language depends on the reuse of structures in a variety of cognitive situations to express authentic messages, rather than on knowledge of the way a language functions. Put differently, it prioritises the acquisition of implicit competence over the learning of grammatical knowledge. This is achieved by the adoption of a literacy-based approach and an increase in the intensity of instruction. Besides having strong empirical support from numerous experiments, the NLA has a sound theoretical foundation, as it conforms to research in neurolinguistics. The five pedagogical principles that define the approach will be explained, as well as the differences between the NLA and the paradigm on which most current resources and teaching strategies are based. It is now 25 years since the original research occurred. The use of the NLA, as will be shown, has expanded widely. With some adaptations, it is used for other languages and in other milieus. In Canada, classes are offered in Mandarin, Ukrainian, Spanish, and Arabic, amongst others. It has also been used in several indigenous communities, for example to restore the use of Mohawk, Cree, and Dene. Its use has expanded throughout the world, including China, Japan, France, Germany, Belgium, Poland, Russia, and Mexico.
The Intensive French program originally focused on students in grades 5 or 6 (ages 10 to 12); nowadays, programs based on the approach include adults, particularly immigrants entering new countries. With the increasing interest in inclusion and cultural diversity, there is a demand for language learning amongst pre-school and primary children that can be successfully addressed by the NLA. Other current experiments target trilingual schools and work with the Inuit communities of Nunavik in the province of Quebec.

Keywords: neuroeducation, neurolinguistic approach, literacy, second language acquisition, plurilingualism, foreign language teaching and learning

Procedia PDF Downloads 73
15438 Whether Chaos Theory Could Reconstruct the Ancient Societies

Authors: Zahra Kouzehgari

Abstract:

Since its early emergence in the 1970s in mathematics and the physical sciences, chaos theory has increasingly been developed and adapted in the social sciences as well. The non-linear and dynamic characteristics of the theory make it a useful conceptual framework for interpreting the behavior of complex social systems. With regard to the principles of the chaotic approach (sensitivity to initial conditions, dynamic adaptation, strange attractors, and unpredictability), this paper aims to examine whether the chaos approach could interpret ancient social changes. To do this, a brief history of chaos theory, its development and application in social science, and the principles underlying the theory are first reviewed, followed by its application in archaeological science. The study demonstrates that although existing archaeological records cannot reconstruct the whole social system of the human past, non-linear approaches to studying complex social systems would be of great help in finding the general order of ancient societies and would enable us to shed light on some of the social phenomena in human history, or to make sense of them.
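The sensitivity to initial conditions invoked above can be made concrete with the textbook logistic map (this demonstration is an editorial illustration, not part of the paper): two trajectories starting a billionth apart diverge to order-one separation within a few dozen iterations.

```python
# Two logistic-map trajectories (r = 4, the fully chaotic regime) starting
# 1e-9 apart; their gap grows exponentially until it saturates at O(1).
def logistic_traj(x0, r=4.0, steps=50):
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1 - xs[-1]))
    return xs

a = logistic_traj(0.2)
b = logistic_traj(0.2 + 1e-9)
gap = [abs(x - y) for x, y in zip(a, b)]
print(gap[0], gap[-1])
```

This exponential divergence is exactly why long-range prediction of a chaotic social system is impossible even when the governing rule is simple and fully known.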

Keywords: archaeology, non-linear approach, chaos theory, ancient social systems

Procedia PDF Downloads 286
15437 Taguchi-Based Six Sigma Approach to Optimize Surface Roughness for Milling Processes

Authors: Sky Chou, Joseph C. Chen

Abstract:

This paper focuses on using Six Sigma methodologies to improve the surface roughness of a part produced on a CNC milling machine. It presents a case study where the surface roughness of milled aluminum must be reduced to eliminate defects and to improve the process capability indices Cp and Cpk for a CNC milling process. The Six Sigma DMAIC (define, measure, analyze, improve, and control) approach was applied in this study to improve the process, reduce defects, and ultimately reduce costs. The Taguchi-based Six Sigma approach was applied to identify the optimized processing parameters that led to the targeted surface roughness specified by our customer. An L9 orthogonal array was applied in the Taguchi experimental design, with four controllable factors and one non-controllable/noise factor. The four controllable factors identified consist of feed rate, depth of cut, spindle speed, and surface roughness. The noise factor is the difference between the old cutting tool and the new cutting tool. The confirmation run with the optimal parameters confirmed that the new parameter settings are correct. The new settings also improved the process capability index. This study shows that the Taguchi-based Six Sigma approach can be efficiently used to phase out defects and improve the process capability index of the CNC milling process.
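The standard Taguchi analysis for a response like surface roughness, where lower is better, computes a smaller-the-better signal-to-noise ratio per L9 trial and picks the settings with the highest S/N. A minimal sketch, with invented roughness readings (the two columns stand in for the old-tool/new-tool noise factor; none of these numbers come from the study):

```python
import numpy as np

# Smaller-the-better S/N ratio: S/N = -10 * log10(mean(y^2)).
# Rows = the 9 L9 trials; columns = replicates under the two noise conditions.
runs = np.array([
    [2.1, 2.3], [1.8, 2.0], [1.5, 1.7],
    [1.9, 2.2], [1.4, 1.6], [1.2, 1.3],
    [1.6, 1.8], [1.1, 1.2], [0.9, 1.0],
])
sn = -10 * np.log10((runs ** 2).mean(axis=1))
best_run = int(np.argmax(sn))  # highest S/N = most robust trial settings
print(sn.round(2), best_run)
```

In a full analysis, the per-trial S/N values would then be averaged by factor level to rank each factor's effect before the confirmation run.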

Keywords: CNC machining, six sigma, surface roughness, Taguchi methodology

Procedia PDF Downloads 243
15436 Optimization of the Numerical Fracture Mechanics

Authors: H. Hentati, R. Abdelmoula, Li Jia, A. Maalej

Abstract:

In this work, we present numerical simulations of quasi-static crack propagation based on the variational approach. We simulate a piece of brittle material without an initial crack, using an alternate minimization algorithm. Based on these numerical results, we determine the influence of numerical parameters on the location of the crack. We show the importance of optimizing the time of numerical computation and present a first attempt to develop a simple numerical method to optimize this time.
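The alternate minimization pattern used in such variational simulations (minimize over the displacement field with damage frozen, then over damage with displacement frozen, and repeat) can be sketched on a toy coupled energy. The functional below, with one displacement-like variable `u` and one damage-like variable `d`, is an illustrative stand-in chosen so each half-step has a closed form; it is not the actual fracture functional of the paper:

```python
# Toy energy: a (1-d)^2-degraded elastic term, a gc*d^2 damage penalty, and
# an extra 0.5*u^2 term (assumed here purely to keep the u-problem well-posed).
def energy(u, d, load=1.0, gc=0.5):
    return (1 - d) ** 2 * (u - load) ** 2 + gc * d ** 2 + 0.5 * u ** 2

def alternate_minimize(iters=50, load=1.0, gc=0.5):
    u, d = 0.0, 0.0
    for _ in range(iters):
        # argmin over u with d frozen: solve 2(1-d)^2 (u - load) + u = 0
        u = 2 * (1 - d) ** 2 * load / (2 * (1 - d) ** 2 + 1)
        # argmin over d with u frozen: solve -2(1-d)(u-load)^2 + 2*gc*d = 0
        a = (u - load) ** 2
        d = a / (a + gc)   # always lands in [0, 1)
    return u, d

u, d = alternate_minimize()
print(u, d, energy(u, d))
```

Each half-step can only decrease the energy, which is what makes the scheme robust for the non-convex functionals of fracture mechanics even though it may converge to a local minimum.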

Keywords: fracture mechanics, optimization, variational approach, mechanics

Procedia PDF Downloads 607
15435 Evaluation of Model-Based Code Generation for Embedded Systems–Mature Approach for Development in Evolution

Authors: Nikolay P. Brayanov, Anna V. Stoynova

Abstract:

The model-based development approach is gaining support and acceptance. Its higher abstraction level simplifies system description, allowing domain experts to do their best work without particular knowledge of programming. The different levels of simulation support rapid prototyping, verifying, and validating the product even before it exists physically. Nowadays the model-based approach is beneficial for modelling complex embedded systems as well as for generating code for many different hardware platforms. Moreover, it can be applied in safety-relevant industries like automotive, which brings extra automation of the expensive device certification process, especially in software qualification. Using it, some companies report cost savings and quality improvements, but others claim no major changes or even cost increases. This publication examines the level of maturity and autonomy of the model-based approach for code generation. It is based on a real-life automotive seat heater (ASH) module, developed using The MathWorks, Inc. tools. The model, created with Simulink, Stateflow, and MATLAB, is used for automatic generation of C code with Embedded Coder. To prove the maturity of the process, the Code Generation Advisor is used for automatic configuration. All additional configuration parameters are set to auto, when applicable, leaving the generation process to function autonomously. As a result of the investigation, the publication compares the quality of automatically generated embedded code with that of manually developed code. The measurements show that, in general, the code generated by the automatic approach is no worse than the manual one. A deeper analysis of the technical parameters enumerates the disadvantages, some of which are identified as topics for our future work.

Keywords: embedded code generation, embedded C code quality, embedded systems, model-based development

Procedia PDF Downloads 244
15434 Proportional and Integral Controller-Based Direct Current Servo Motor Speed Characterization

Authors: Adel Salem Bahakeem, Ahmad Jamal, Mir Md. Maruf Morshed, Elwaleed Awad Khidir

Abstract:

Direct Current (DC) servo motors, or simply DC motors, play an important role in many industrial applications, such as the manufacturing of plastics, precise positioning of equipment, and computer-controlled systems where feed speed control, position holding, and a consistently desired output are critical. These parameters can be controlled with the help of control systems such as the Proportional Integral Derivative (PID) controller. The aim of the current work is to investigate the effects of Proportional (P) and Integral (I) controllers on the steady-state and transient response of the DC motor. The controller gains are varied to observe their effects on the error, damping, and stability of the steady and transient motor response. The current investigation is conducted experimentally on a servo trainer CE 110 using the analog PI controller CE 120, and theoretically using Simulink in MATLAB. Both the experimental and theoretical work involve varying the integral controller gain to obtain the response to a steady-state input; varying, individually, the proportional and integral controller gains to obtain the response to a step input function at a certain frequency; and theoretically obtaining the proportional and integral controller gains for desired values of damping ratio and response frequency. Results reveal that a proportional controller helps reduce the steady-state and transient error between the input signal and output response and makes the system more stable. In addition, it also speeds up the response of the system. On the other hand, the integral controller eliminates the error but tends to make the system unstable, with induced oscillations and a slow response to eliminate the error. From the current work, it is desired to achieve a stable response of the servo motor in terms of its angular velocity subjected to steady-state and transient input signals by utilizing the strengths of both P and I controllers.
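The contrast drawn above (P alone leaves a steady-state error; adding I removes it) can be reproduced with a forward-Euler simulation of PI speed control of a first-order motor model K/(tau*s + 1). The gains and motor constants below are illustrative, not the CE 110 trainer's actual parameters:

```python
# PI speed control of a first-order DC motor model, integrated with
# forward-Euler steps. Returns the angular-velocity history for a unit step.
def simulate(kp, ki, k_motor=1.0, tau=0.5, setpoint=1.0, dt=0.001, t_end=5.0):
    omega, integral = 0.0, 0.0
    history = []
    for _ in range(int(t_end / dt)):
        error = setpoint - omega
        integral += error * dt
        u = kp * error + ki * integral             # PI control law
        omega += dt * (k_motor * u - omega) / tau  # motor dynamics
        history.append(omega)
    return history

p_only = simulate(kp=5.0, ki=0.0)    # P alone: steady-state error remains
pi_ctrl = simulate(kp=5.0, ki=10.0)  # integral action drives the error to zero
print(abs(1 - p_only[-1]), abs(1 - pi_ctrl[-1]))
```

With these numbers the P-only loop settles at kp*K/(1 + kp*K) = 5/6 of the setpoint, while the PI loop (closed-loop poles at -2 and -10, so overdamped and stable) converges to the setpoint itself.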

Keywords: DC servo motor, proportional controller, integral controller, controller gain optimization, Simulink

Procedia PDF Downloads 110
15433 Solutions for Food-Safe 3D Printing

Authors: Geremew Geidare Kailo, Igor Gáspár, András Koris, Ivana Pajčin, Flóra Vitális, Vanja Vlajkov

Abstract:

Three-dimensional (3D) printing, a very popular additive manufacturing technology, has recently undergone rapid growth and replaced the use of conventional technology from prototyping to producing end-user parts and products. 3D printing involves a digital manufacturing machine that produces three-dimensional objects according to designs created by the user via 3D modeling or computer-aided design/manufacturing (CAD/CAM) software. The most popular 3D printing system is Fused Deposition Modeling (FDM), also called Fused Filament Fabrication (FFF). A 3D-printed object is considered food safe if it can have direct contact with food without any toxic effects, even after cleaning, storing, and reusing the object. This work analyzes the processing timeline of the filament (the material for 3D printing) from unboxing to extrusion through the nozzle. It is an important task to analyze the growth of bacteria on the 3D-printed surface and in the gaps between the layers. By default, a 3D-printed object is not food safe after longer usage and direct contact with food (even when food-safe filaments are used), but there are solutions to this problem. The aim of this work was to evaluate the 3D-printed object from different perspectives of food safety. The first was testing antimicrobial 3D printing filaments from a food safety aspect, since a 3D-printed object in the food industry may have direct contact with food; the main purpose of the work is therefore to reduce the microbial load on the surface of a 3D-printed part. Coating with epoxy resin was investigated, too, to see its effect on mechanical strength, thermal resistance, surface smoothness, and food safety (cleanability). Another aim of this study was to test new temperature-resistant filaments and the effect of high temperature on 3D-printed materials, to see if they can be cleaned with boiling or a similar high-temperature treatment.
This work proved that all three mentioned methods can improve the food safety of the 3D-printed object, but the size of this effect varies. The best result was obtained with the epoxy resin coating: the object was as cleanable as any other injection-molded plastic object with a smooth surface. We also obtained very good results by boiling the objects, and it is good to see that nowadays more and more special filaments have a food-safe certificate and can withstand boiling temperatures too. Using antibacterial filaments reduced bacterial colonies to one fifth, but the biggest advantage of this method is that it does not require any post-processing: the object is ready straight out of the 3D printer. Acknowledgements: The research was supported by the Hungarian and Serbian bilateral scientific and technological cooperation project funded by the Hungarian National Office for Research, Development and Innovation (NKFI, 2019-2.1.11-TÉT-2020-00249) and the Ministry of Education, Science and Technological Development of the Republic of Serbia. The authors acknowledge the Hungarian University of Agriculture and Life Sciences' Doctoral School of Food Science for its support of this study.

Keywords: food safety, 3D printing, filaments, microbial, temperature

Procedia PDF Downloads 143
15432 Executive Functions Directly Associated with Severity of Perceived Pain above and beyond Depression in the Context of Medical Rehabilitation

Authors: O. Elkana, O Heyman, S. Hamdan, M. Franko, J. Vatine

Abstract:

Objective: To investigate whether a direct link exists between perceived pain (PP) and executive functions (EF), above and beyond the influence of depression symptoms, in the context of medical rehabilitation. Design: Cross-sectional study. Setting: Rehabilitation hospital. Participants: 125 medical records of hospitalized patients were screened against our inclusion criteria. Only 60 patients were found eligible and were asked to participate; 19 declined to participate for personal reasons. The 41 neurologically intact patients (mean age 46, SD 14.96) who participated in this study were in the sub-acute stage of recovery, fluent in Hebrew, with an intact upper limb (to neutralize influences on psychomotor performance), and without organic brain damage. Main Outcome Measures: EF were assessed using the Wisconsin Card Sorting Test (WCST) and the Stop-Signal Test (SST). PP was measured using 3 well-known pain questionnaires: the Pain Disability Index (PDI), the Short-Form McGill Questionnaire (SF-MPQ), and the Pain Catastrophizing Scale (PCS). A perceived pain index (PPI) was calculated as the mean composite score of the 3 pain questionnaires. Depression symptoms were assessed using the Patient Health Questionnaire (PHQ-9). Results: The results indicate that irrespective of the presence of depression symptoms, PP is directly correlated with response inhibition (SST partial correlation: r=0.5; p=0.001) and mental flexibility (WCST partial correlation: r=-0.37; p=0.021), suggesting decreased performance in EF as PP severity increases. High correlations were found between the 3 pain measurements: SF-MPQ with PDI (r=0.62, p<0.001), SF-MPQ with PCS (r=0.58, p<0.001), and PDI with PCS (r=0.38, p=0.016), and each questionnaire alone was also significantly associated with EF; thus, no single questionnaire ‘pulled’ the results obtained by the general index (PPI).
Conclusion: Examining the direct association between PP and EF, beyond the contribution of depression symptoms, provides further clinical evidence suggesting that EF and PP share underlying mediating neuronal mechanisms. Clinically, the importance of assessing patients' EF abilities as well as PP severity during rehabilitation is underscored.

Keywords: depression, executive functions, mental-flexibility, neuropsychology, pain perception, perceived pain, response inhibition

Procedia PDF Downloads 250
15431 Characteristic Function in Estimation of Probability Distribution Moments

Authors: Vladimir S. Timofeev

Abstract:

In this article, the problem of estimating distributional moments is considered. A new approach to moment estimation based on the characteristic function is proposed. Using a statistical simulation technique, the author shows that the new approach has some robustness properties. Numerical differentiation is used to calculate the derivatives of the characteristic function. The obtained results confirm that the author’s idea works with reasonable efficiency and can be recommended for statistical applications.
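The core identity behind this approach is that the k-th derivative of the characteristic function at zero gives the k-th raw moment: phi^(k)(0) = i^k E[X^k]. A minimal sketch of the idea, using the empirical characteristic function and central finite differences (the step size `h` and the restriction to the first two moments are simplifications, not the article's actual scheme):

```python
import numpy as np

# Empirical characteristic function phi(t) = mean(exp(i*t*X)); the k-th raw
# moment is phi^(k)(0) / i^k, estimated here by central finite differences.
def ecf(t, x):
    return np.exp(1j * t * x[:, None]).mean(axis=0)

def moment_via_cf(x, k, h=1e-2):
    if k == 1:
        d = (ecf(np.array([h]), x) - ecf(np.array([-h]), x)) / (2 * h)
    elif k == 2:
        d = (ecf(np.array([h]), x) - 2 * ecf(np.array([0.0]), x)
             + ecf(np.array([-h]), x)) / h ** 2
    else:
        raise ValueError("only k = 1, 2 implemented in this sketch")
    return (d / 1j ** k).real[0]

rng = np.random.default_rng(1)
x = rng.normal(2.0, 1.0, size=100_000)
print(moment_via_cf(x, 1), moment_via_cf(x, 2))  # near E[X] = 2 and E[X^2] = 5
```

Because the characteristic function is a bounded average, extreme observations enter only through unit-modulus terms exp(i*t*x), which is one intuition for the robustness to outliers the article reports.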

Keywords: characteristic function, distributional moments, robustness, outlier, statistical estimation problem, statistical simulation

Procedia PDF Downloads 506
15430 The Observable Method for the Regularization of Shock-Interface Interactions

Authors: Teng Li, Kamran Mohseni

Abstract:

This paper presents an inviscid regularization technique that is capable of regularizing shocks and sharp interfaces simultaneously in shock-interface interaction simulations. The direct numerical simulation of flows involving shocks has been investigated for many years, and many numerical methods were developed to capture the shocks. However, most of these methods rely on numerical dissipation to regularize the shocks. Moreover, in high Reynolds number flows, the nonlinear terms in hyperbolic Partial Differential Equations (PDEs) dominate, constantly generating small-scale features. This makes direct numerical simulation of shocks even harder. The same difficulty arises in two-phase flow with sharp interfaces, where the nonlinear terms in the governing equations keep sharpening the interfaces into discontinuities. The main idea of the proposed technique is to average out the small scales that are below the resolution (observable scale) of the computational grid by filtering the convective velocity in the nonlinear terms of the governing PDE. This technique is named the “observable method”, and it results in a set of hyperbolic equations called observable equations, namely, the observable Navier-Stokes or Euler equations. The observable method has been applied to flow simulations involving shocks, turbulence, and two-phase flows, and the results are promising. In the current paper, the observable method is examined on its performance in regularizing shocks and interfaces at the same time in shock-interface interaction problems. Bubble-shock interactions and the Richtmyer-Meshkov instability are chosen as test problems. The observable Euler equations are solved numerically with pseudo-spectral discretization in space and a third-order Total Variation Diminishing (TVD) Runge-Kutta method in time. Results are presented and compared with existing publications. The interface acceleration and deformation and shock reflection are particularly examined.
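The key move, filtering the convective velocity inside the nonlinear term, can be illustrated on 1D inviscid Burgers, u_t + ubar * u_x = 0, where ubar is a low-pass-filtered copy of u. This is an editorial sketch of the filtering idea only: the Helmholtz filter, the filter width `alpha`, and the classic RK4 stepper are assumptions for brevity, not the paper's observable Euler equations or its TVD RK3 scheme:

```python
import numpy as np

n, alpha, dt, steps = 256, 0.05, 1e-3, 500
x = np.linspace(0, 2 * np.pi, n, endpoint=False)
k = np.fft.fftfreq(n, d=1.0 / n)  # integer wavenumbers on [0, 2*pi)
u = np.sin(x)                     # smooth data that would steepen into a shock

def rhs(u):
    uhat = np.fft.fft(u)
    # Helmholtz low-pass filter: (1 - alpha^2 d^2/dx^2) ubar = u
    ubar = np.fft.ifft(uhat / (1 + (alpha * k) ** 2)).real
    ux = np.fft.ifft(1j * k * uhat).real
    return -ubar * ux             # filtered convective velocity advects u

for _ in range(steps):
    k1 = rhs(u)                   # classic RK4 time step
    k2 = rhs(u + 0.5 * dt * k1)
    k3 = rhs(u + 0.5 * dt * k2)
    k4 = rhs(u + dt * k3)
    u = u + dt / 6 * (k1 + 2 * k2 + 2 * k3 + k4)

print(float(np.abs(u).max()))
```

Because only the advecting velocity is smoothed, the solution values themselves are transported without added dissipation, which is the sense in which the regularization is inviscid.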

Keywords: compressible flow simulation, inviscid regularization, Richtmyer-Meshkov instability, shock-bubble interactions

Procedia PDF Downloads 349