Search results for: authoring tool
4083 Designing Automated Embedded Assessment to Assess Student Learning in a 3D Educational Video Game
Authors: Mehmet Oren, Susan Pedersen, Sevket C. Cetin
Abstract:
Despite the frequently criticized disadvantages of the traditionally used paper-and-pencil assessment, it remains the most common method in our schools. Although such assessments provide acceptable measurement, they cannot capture all aspects and the richness of learning and knowledge. Many assessments used in schools also decontextualize assessment from learning: they focus on a learner’s standing on a particular topic but not on how student learning changes over time. For these reasons, many scholars advocate that using simulations and games (S&G) as assessment tools has significant potential to overcome the problems of traditional methods. S&G can benefit from changes in technology and provide a contextualized medium for assessment and teaching. Furthermore, S&G can serve as an instructional tool rather than a method for testing students’ learning at a single point in time. To investigate the potential of educational games as assessment and teaching tools, this study presents the implementation and validation of an automated embedded assessment (AEA), which can constantly monitor student learning in the game and assess performance without interfering with learning. The experiment was conducted in an undergraduate engineering course (Digital Circuit Design) with 99 participating students over a period of five weeks in the Spring 2016 semester. The purpose of this research is to examine whether the proposed AEA method is valid for assessing student learning in a 3D educational game and to present the implementation steps. To address this question, the study examines three aspects of the AEA for validation. First, the evidence-centered design model was used to lay out the design and measurement steps of the assessment. Then, a confirmatory factor analysis was conducted to test whether the assessment can measure the targeted latent constructs.
Finally, the scores of the assessment were compared with an external measure (a validated test measuring student learning in digital circuit design) to evaluate the convergent validity of the assessment. The results of the confirmatory factor analysis showed that the fit of the model with three latent factors and one higher-order factor was acceptable (RMSEA < 0.00, CFI = 1, TLI = 1.013, WRMR = 0.390). All of the observed variables loaded significantly on the latent factors in the model. In the second analysis, multiple regression was used to test whether the external measure significantly predicts students’ performance in the game. The results indicated that the two predictors explained 36.3% of the variance (R2 = .36, F(2,96) = 27.42.56, p < .00). Students’ posttest scores significantly predicted game performance (β = .60, p < .000). These statistical results show that the AEA can distinctly measure three major components of the digital circuit design course. This study is intended to help researchers understand how to design an AEA and to showcase an implementation by providing an example methodology for validating this type of assessment.
Keywords: educational video games, automated embedded assessment, assessment validation, game-based assessment, assessment design
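The convergent-validity step above boils down to regressing in-game performance on the external measure and reading off R². A minimal pure-Python sketch of that computation (the scores below are invented for illustration, not the study’s data):

```python
def simple_ols(x, y):
    """Slope, intercept and R^2 for a simple regression y ~ x."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    slope = sxy / sxx
    intercept = my - slope * mx
    # R^2 = 1 - residual sum of squares / total sum of squares
    ss_res = sum((yi - (slope * xi + intercept)) ** 2 for xi, yi in zip(x, y))
    ss_tot = sum((yi - my) ** 2 for yi in y)
    return slope, intercept, 1.0 - ss_res / ss_tot

# Hypothetical posttest scores (external measure) and in-game performance:
posttest = [55, 60, 68, 74, 82, 90]
game = [40, 48, 55, 60, 71, 80]
slope, intercept, r2 = simple_ols(posttest, game)
```

With a second predictor this generalizes to the multiple regression reported in the abstract (R2 = .36); in practice a statistics package would be used rather than hand-rolled code.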
Procedia PDF Downloads 422
4082 Psychometric Properties of the Social Skills Rating System: Teacher Version
Authors: Amani Kappi, Ana Maria Linares, Gia Mudd-Martin
Abstract:
Children with Attention Deficit Hyperactivity Disorder (ADHD) are more likely to develop social skills deficits that can lead to academic underachievement, peer rejection, and maladjustment. Surveying teachers about the social skills of children with ADHD is a significant step in identifying whether a child has social skills deficits. The teacher version of the Social Skills Rating System scale (SSRS-T) has been used as a screening tool for children’s social behaviors. The psychometric properties of the SSRS-T have been evaluated in various populations and settings, such as when teachers use it to assess the social skills of children with learning disabilities. However, few studies have examined the psychometric properties of the SSRS-T when used to assess children with ADHD. The purpose of this study was to examine the psychometric properties of the SSRS-T and two SSRS-T subscales, Social Skills and Problem Behaviors. This was a secondary analysis of longitudinal data from the Fragile Families and Child Well-Being Study. The sample comprised 194 teachers who used the SSRS-T to assess the social skills of children aged 8 to 10 years with ADHD. Exploratory principal components factor analysis was used to assess the construct validity of the SSRS-T scale. Cronbach’s alpha was used to assess the internal consistency reliability of the total SSRS-T scale and the subscales. Item analyses included item-item intercorrelations, item-to-subscale correlations, and changes in Cronbach’s alpha with item deletion. Internal consistency reliability for both the total scale and the subscales was acceptable. The results of the exploratory factor analysis supported the five factors of the SSRS-T (Cooperation, Self-control, Assertion, Internalizing behaviors, and Externalizing behaviors) reported in the original version.
Findings indicated that the SSRS-T is a reliable and valid tool for assessing the social behaviors of children with ADHD.
Keywords: ADHD, children, social skills, SSRS-T, psychometric properties
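The internal-consistency check described above uses Cronbach’s alpha, α = k/(k−1) · (1 − Σσ²ᵢ/σ²ₜ), where k is the number of items, σ²ᵢ are the item variances, and σ²ₜ is the variance of the total score. A minimal sketch with invented item scores (not the study’s data):

```python
def cronbach_alpha(items):
    """items: list of per-item score lists, each of length n_respondents."""
    k = len(items)
    n = len(items[0])

    def var(xs):  # sample variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    sum_item_vars = sum(var(item) for item in items)
    # total score per respondent across all items
    totals = [sum(items[i][j] for i in range(k)) for j in range(n)]
    return k / (k - 1) * (1 - sum_item_vars / var(totals))

# Three hypothetical rating items scored by five teachers:
alpha = cronbach_alpha([[3, 2, 4, 3, 5],
                        [2, 2, 3, 3, 4],
                        [3, 1, 4, 2, 5]])
```

An alpha of roughly 0.7 or above is conventionally read as acceptable internal consistency, which is the threshold the abstract’s “acceptable” judgment presumably reflects.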
Procedia PDF Downloads 131
4081 Measurement and Monitoring of Graduate Attributes via iCGPA Implementation and ACADEMIA Programming: UNIMAS Case Study
Authors: Shanti Faridah Salleh, Azzahrah Anuar, Hamimah Ujir, Rohana Sapawi, Wan Hashim Wan Ibrahim, Noraziah Abdul Wahab, Majina Sulaiman, Raudhah Ahmadi, Al-Khalid Othman, Johari Abdullah
Abstract:
Integrated Cumulative Grade Point Average, or iCGPA, is an evaluation and reporting system that represents the comprehensive development of students’ achievement in their academic programmes. Universiti Malaysia Sarawak (UNIMAS) started its implementation of iCGPA in 2016. iCGPA is driven by the Outcome-Based Education (OBE) system that has long been integrated into higher education in Malaysia. iCGPA is not only a tool to enhance the OBE concept through constructive alignment but also an integrated mechanism to assist various stakeholders in making decisions or planning for programme improvement. The outcome of this integrated system is the reporting of students’ academic performance in terms of the cognitive (knowledge), psychomotor (skills), and affective (attitude) abilities that students acquire throughout their study. The iCGPA reporting illustrates students’ attainment of attributes across the eight domains of learning outcomes listed in the Malaysian Qualifications Framework (MQF). This paper discusses the implementation of iCGPA in UNIMAS, including the policy and strategy for directing the whole university to implement it. The steps and challenges in integrating the existing Outcome-Based Education system and utilising iCGPA as a tool to quantify students’ achievement are also highlighted. Finally, the ACADEMIA system, a dedicated centralised programme developed to ensure the successful implementation of iCGPA, is presented. This paper discusses the structure and analysis of the ACADEMIA programme and concludes with an analysis of the improvements made to constructive alignment in the 40 programmes involved in the iCGPA implementation.
Keywords: constructive alignment, holistic graduates, mapping of assessment, programme outcome
Procedia PDF Downloads 209
4080 Understanding Student Engagement through Sentiment Analytics of Response Times to Electronically Shared Feedback
Authors: Yaxin Bi, Peter Nicholl
Abstract:
The rapid advancement of Information and Communication Technologies (ICT) is profoundly influencing every aspect of Higher Education. It has transformed traditional teaching, learning, assessment, and feedback into a new era of Digital Education. This also introduces many challenges in capturing and understanding student engagement with their studies in Higher Education. The School of Computing at Ulster University has developed a Feedback And Notification (FAN) Online tool that has been used to send students links to personalized feedback on their submitted assessments and to record both how frequently students review the shared feedback and how quickly they collect it. The student initially receives the feedback via a personal email containing a URL link that maps to the feedback created by the academic marker. This feedback is typically a Word or PDF report including comments and the final mark for the work submitted approximately three weeks earlier. When the student clicks on the link, their personal feedback is viewable in the browser. The FAN tool provides the academic marker with a report of when and how often a student viewed the feedback via the link. This paper presents an investigation into student engagement through analyzing the interaction timestamps and the frequency of review by the student. We propose an approach to modeling interaction timestamps and use sentiment classification techniques to analyze the data collected over the last five years for a set of modules. The data studied span a number of final-year and second-year modules in the School of Computing. The paper presents the details of the quantitative analysis methods and further describes students’ interactions with the feedback over time on each module studied.
We have grouped the students into different engagement categories based on the sentiment analysis results and then suggest early targeted intervention for the set of students seen to be under-performing under our proposed model.
Keywords: feedback, engagement, interaction modelling, sentiment analysis
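As a rough illustration of how collection speed and review frequency might project a student into an engagement group, the sketch below buckets one student from the feedback-sent timestamp and the recorded view timestamps. The thresholds and labels here are illustrative assumptions, not the paper’s actual model:

```python
from datetime import datetime

def engagement_group(sent_at, views, max_delay_hours=72.0):
    """Coarse engagement label from collection speed and review frequency.

    sent_at: datetime the feedback email was sent
    views:   datetimes when the student opened the feedback link
    Thresholds are illustrative assumptions only.
    """
    if not views:
        return "disengaged"
    # hours between the email being sent and the first view
    delay_h = (min(views) - sent_at).total_seconds() / 3600.0
    if delay_h <= max_delay_hours and len(views) >= 3:
        return "highly engaged"      # prompt collection, repeated review
    if delay_h <= max_delay_hours:
        return "prompt, single review"
    return "late collector"

label = engagement_group(datetime(2024, 1, 1, 9, 0),
                         [datetime(2024, 1, 2, 10, 0),
                          datetime(2024, 1, 3, 8, 30),
                          datetime(2024, 1, 5, 20, 0)])
```

In the paper’s pipeline these per-student features would feed the sentiment classification stage rather than fixed thresholds.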
Procedia PDF Downloads 103
4079 Modeling of Timing in a Cyber Conflict to Inform Critical Infrastructure Defense
Authors: Brian Connett, Bryan O'Halloran
Abstract:
Systems assets within critical infrastructures were once seemingly safe from exploitation or attack by nefarious cyberspace actors. Now, critical infrastructure is a target, and the resources to exploit its cyber-physical systems exist. These resources are characterized by patience, stealth, replicability, and extraordinary robustness. System owners are obligated to maintain a high level of protection measures. The difficulty lies in knowing when to fortify a critical infrastructure against an impending attack. Models currently exist that demonstrate the value of knowing the attacker’s capabilities in the cyber realm and the strength of the target. The shortcoming of these models is that they are not designed to respond to the inherently fast timing of an attack, an impetus that can be derived from open-source reporting, common knowledge of exploits, and the physical architecture of the infrastructure. A useful model will inform system owners how to align infrastructure architecture in a manner that is responsive to the capability, willingness, and timing of the attacker. This research group has used an existing theoretical model for estimating parameters and, through analysis, has developed a decision tool for would-be target owners. The continuation of the research further develops this model by estimating the variable parameters. Understanding these parameter estimates will uniquely position the decision maker to posture defenses, having revealed an attacker’s persistence and stealth. This research explores different approaches to improving current attacker-defender models that focus on cyber threats. An existing foundational model takes the point of view of an attacker who must decide what cyber resource to use and when to use it to exploit a system vulnerability.
The model is valuable for estimating parameters and, through analysis, for developing a decision tool for would-be target owners.
Keywords: critical infrastructure, cyber physical systems, modeling, exploitation
Procedia PDF Downloads 192
4078 Virtual Reality as a Tool in Modern Education
Authors: Łukasz Bis
Abstract:
The author discusses virtual reality and its importance for new didactic methods. It has been known for years that experience-based education gives much better results in terms of long-term memory than theoretical study. However, practice is expensive; virtual reality allows an empirical approach to learning with minimized production costs. The author defines what makes a given VR experience appropriate for the didactic and cognitive process. The article provides a list of guidelines and explains their importance for a VR experience under development.
Keywords: virtual reality, education, universal design, guideline
Procedia PDF Downloads 106
4077 BIM4Cult: Leveraging BIM and IoT for Enhancing Fire Safety in Historical Buildings
Authors: Anastasios Manos, Despina Elisabeth Filippidou
Abstract:
Introduction: Historical buildings are an integral part of the cultural heritage of every place, and beyond the obvious need for protection against risks, they have specific requirements regarding the handling of hazards and disasters such as fire, floods, earthquakes, etc. Ensuring high levels of protection and safety for these buildings is imperative for two distinct but interconnected reasons: a) they themselves constitute cultural heritage, and b) they are often used as museums or cultural spaces, necessitating the protection of both human life (visitors and workers) and the cultural treasures they house. However, these buildings present serious constraints on implementing the necessary protective measures due to their unique architecture, construction methods, and/or the structural materials used in the past, which have created an existing condition that is sometimes challenging to reshape and operate within the framework of modern regulations and protection measures. One of the most devastating risks that threaten historical buildings is fire. Catastrophic fires demonstrate the need for timely evaluation of fire safety measures in historical buildings. Recognizing the criticality of protecting historical buildings from the risk of fire, the Confederation of Fire Protection Associations in Europe (CFPA E) issued specific guidelines in 2013 (CFPA-E Guideline No 30:2013 F) for the fire protection of historical buildings at the European level. However, until now, few actions have been implemented towards leveraging modern technologies in the construction and maintenance of buildings, such as Building Information Modeling (BIM) and the Internet of Things (IoT), for the protection of historical buildings from risks like fires and floods. The project BIM4Cult has been developed to fill this gap.
It is a tool for the timely assessment and monitoring of the fire safety level of historical buildings, using BIM and IoT technologies in an integrated manner. The tool serves as a decision-support expert system for improving the fire safety of historical buildings by continuously monitoring, controlling, and assessing critical risk factors for fire.
Keywords: IoT, fire, BIM, expert system
Procedia PDF Downloads 71
4076 A Prototype of an Information and Communication Technology Based Intervention Tool for Children with Dyslexia
Authors: Rajlakshmi Guha, Sajjad Ansari, Shazia Nasreen, Hirak Banerjee, Jiaul Paik
Abstract:
Dyslexia is a neurocognitive disorder affecting around fifteen percent of the Indian population. The symptoms include difficulty in reading letters, words, and sentences. This difficulty can occur at the phonemic or recognition level and may further affect lexical structures. Therapeutic intervention for dyslexic children after assessment is generally done by special educators and psychologists through one-on-one interaction. Considering the large number of children affected and the scarcity of experts, access to care is limited in India. Moreover, the unavailability of resources and of timely communication with caregivers adds to the problem of proper intervention. The development of Educational Technology and its use in India have improved access to information and care in such a large and diverse country. In this context, this paper proposes an ICT-enabled home-based intervention program for dyslexic children that supports the child and provides an interactive interface between expert, parents, and students. The paper discusses the details of the database design and system layout of the program. It also highlights the development of the different technical aids required to build personalized Android applications for the Indian dyslexic population. These technical aids include speech database creation for children, an automatic speech recognition system, serious game development, and color-coded fonts. The paper also describes the games developed to assist the dyslexic child with cognitive training, primarily for attention, working memory, and spatial reasoning. In addition, it discusses the specific elements of the interactive intervention tool that make it effective for home-based intervention of dyslexia.
Keywords: Android applications, cognitive training, dyslexia, intervention
Procedia PDF Downloads 291
4075 An Approach to Automate the Modeling of Life Cycle Inventory Data: Case Study on Electrical and Electronic Equipment Products
Authors: Axelle Bertrand, Tom Bauer, Carole Charbuillet, Martin Bonte, Marie Voyer, Nicolas Perry
Abstract:
The complexity of Life Cycle Assessment (LCA) can be identified as the ultimate obstacle to its massification. These obstacles can make the diffusion of eco-design and LCA methods in the manufacturing sectors impossible. This article addresses the research question: how can the LCA method be adapted to generalize it massively and improve its performance? This paper aims to develop an approach for automating LCA in order to carry out assessments on a massive scale. To answer this, we proceeded in three steps. First, we analyzed the literature to identify existing automation methods. Given the constraints of large-scale manual processing, it was necessary to define a new approach, drawing inspiration from certain methods and combining them with new ideas and improvements. In the second part, our development of automated construction is presented (reconciliation and implementation of data). Finally, the LCA case study of a conduit is presented to demonstrate the feature-based approach offered by the developed tool. A computerized environment supports effective and efficient decision-making related to materials and processes, facilitating data mapping and hence product modeling. The method can also complete the LCA process on its own within minutes: the calculations and the LCA report are generated automatically. The tool developed has shown that automation by code is a viable solution for meeting LCA’s massification objectives. It has major advantages over the traditional LCA method and overcomes the complexity of LCA. Indeed, the case study demonstrated the time savings associated with this methodology and, therefore, the opportunity to increase the number of LCA reports generated and to meet regulatory requirements. Moreover, this approach also demonstrates the potential of the proposed method for a wide range of applications.
Keywords: automation, EEE, life cycle assessment, life cycle inventory, massively
Procedia PDF Downloads 90
4074 Climate Change and Urban Flooding: The Need to Rethinking Urban Flood Management through Resilience
Authors: Suresh Hettiarachchi, Conrad Wasko, Ashish Sharma
Abstract:
The ever-changing and expanding urban landscape increases the stress on urban systems to support and maintain safe and functional living spaces. Flooding presents one of the more serious threats to this safety, putting a larger number of people in harm’s way in congested urban settings. Climate change is adding to this stress by creating a dichotomy in the urban flood response: on the one hand, climate change is causing storms to intensify, resulting in more destructive, rarer floods; on the other hand, longer dry periods are decreasing the severity of more frequent, less intense floods. This variability creates a need to be more agile and innovative in how we design for and manage urban flooding. Here, we argue that to cope with the challenge climate change brings, we need to move towards managing urban floods through resilience rather than flood prevention. We also argue that dealing with the larger variation in flood response under climate change means looking at flooding from all aspects, rather than the single-dimensional focus on flood depths and extents. In essence, we need to rethink how we manage flooding in the urban space. This change in our thinking and approach to flood management requires a practical way to assess and quantify the resilience built into the urban landscape, so that informed decision-making can support the required changes in planning and infrastructure design. Towards that end, we propose a Simple Urban Flood Resilience Index (SUFRI), based on a robust definition of resilience, as a tool to assess flood resilience. The application of a simple resilience index such as the SUFRI can provide a practical tool that considers urban flood management in a multi-dimensional way and can present solutions that were not previously considered.
When such an index is grounded in a clear and relevant definition of resilience, it can be a reliable and defensible way to assess and assist the process of adapting to the increasing challenges of urban flood management under climate change.
Keywords: urban flood resilience, climate change, flood management, flood modelling
Procedia PDF Downloads 49
4073 Psychopathy Evaluation for People with Intellectual Disability Living in Institute Using Chinese Version of the Psychopathology Inventory
Authors: Lin Fu-Gong
Abstract:
Background: As the WHO has announced, people with intellectual disability (ID) are vulnerable to mental health problems, and there are few custom-made mental health scales for monitoring their mental health. People with ID who have mental health problems often have a worse prognosis and place a heavier burden on their caregivers. Purpose: In this study, we intend to develop a psychopathology scale as a practical tool for monitoring the mental health of people with ID living in institutes. Methods: We adopted the Psychopathology Inventory for Mentally Retarded Adults developed by Professor Matson, which has certified reliability and validity in Western countries, with Dr. Matson’s prior agreement. Over the past year, we first translated the inventory into a validated Chinese version that takes the domestic cultural background into account. We then evaluated the validity and reliability of mental health assessment using this inventory among people with intellectual disability living in an institute. Results: The inventory includes eight psychiatric disorder scales: schizophrenic, affective, psychosexual, adjustment, anxiety, somatoform, and personality disorders, and inappropriate mental adjustment. Around 83% of the 40 people investigated, randomly selected from the institute, were found to have at least one disorder and were recommended for medical help by two evaluators. Among the residents examined, somatoform disorder and inappropriate mental adjustment were the most prevalent, at 60% and 78% of people respectively. Conclusion: The results showed that the prevalence of psychiatric disorders was relatively high among people with ID in the institute, and their mental health problems require further care and follow-up. The results also showed that the psychopathology inventory is a useful tool for institute caregivers and managers, and for government long-term care policy.
In the coming stage, we plan to extend the use of the validated Chinese version of the inventory to more types of institutes for people with ID, to track their dynamic mental health status, including medical needs, relapse, and rehabilitation, and to promote their mental health.
Keywords: intellectual disability, psychiatric disorder, psychopathology inventory, mental health, the institute
Procedia PDF Downloads 276
4072 Development of Map of Gridded Basin Flash Flood Potential Index: GBFFPI Map of QuangNam, QuangNgai, DaNang, Hue Provinces
Authors: Le Xuan Cau
Abstract:
Flash floods occur over short rainfall intervals, from 1 to 12 hours, in small and medium basins. They typically have two characteristics: large water flow and high flow velocity. A flash flood occurs at a hill-valley site (a strip of lowland terrain) in a catchment with a sufficiently large drainage area, steep basin slope, and heavy rainfall. The risk of flash floods is determined through the Gridded Basin Flash Flood Potential Index (GBFFPI). The Flash Flood Potential Index (FFPI) is determined from the terrain slope flash flood index, soil erosion flash flood index, land cover flash flood index, land use flash flood index, and rainfall flash flood index. To determine GBFFPI, each cell in a map is considered the outlet of a water accumulation basin; the GBFFPI of the cell is the basin-average value of the FFPI over the corresponding water accumulation basin. Based on GIS, a tool to compute GBFFPI was developed using the ArcObjects SDK for .NET. The maps of GBFFPI are built in two types: GBFFPI including the rainfall flash flood index (real-time flash flood warning) or GBFFPI excluding it. The GBFFPI tool can be used to identify high flash flood potential sites in a large region as quickly as possible. The GBFFPI improves on the conventional FFPI. Its advantage is that it takes into account the basin response (the interaction of cells) and identifies true flash flood sites (strips of lowland terrain), whereas the conventional FFPI considers each cell in isolation and does not account for the interaction between cells. The GBFFPI map of QuangNam, QuangNgai, DaNang, and Hue was built and exported to Google Earth. The obtained map demonstrates the scientific basis of GBFFPI.
Keywords: ArcObjects SDK for .NET, basin average value of FFPI, gridded basin flash flood potential index, GBFFPI map
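The basin-average definition above can be sketched without GIS: every cell’s FFPI contributes to each cell on its downstream flow path, so each cell ends up with the mean FFPI of all cells draining to it (including itself). A toy illustration in pure Python (cell identifiers, flow directions, and FFPI values are invented; the actual tool uses the ArcObjects SDK for .NET on raster grids):

```python
def gbffpi(ffpi, downstream):
    """Basin-average FFPI for every cell treated as a basin outlet.

    ffpi:       dict cell -> FFPI value of that cell
    downstream: dict cell -> next cell along the flow direction (None at an outlet)
    """
    total = {c: 0.0 for c in ffpi}
    count = {c: 0 for c in ffpi}
    for cell, value in ffpi.items():
        # push this cell's FFPI onto every cell along its downstream path
        node = cell
        while node is not None:
            total[node] += value
            count[node] += 1
            node = downstream.get(node)
    return {c: total[c] / count[c] for c in ffpi}

# Tiny synthetic catchment: a and b drain into c, which drains out of the grid.
index = gbffpi({"a": 3.0, "b": 5.0, "c": 4.0},
               {"a": "c", "b": "c", "c": None})
```

This captures the key contrast the abstract draws: cell `c` is scored by its whole upstream basin, not by its own FFPI alone.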
Procedia PDF Downloads 381
4071 Passive Seismic in Hydrogeological Prospecting: The Case Study from Hard Rock and Alluvium Plain
Authors: Prarabdh Tiwari, M. Vidya Sagar, K. Bhima Raju, Joy Choudhury, Subash Chandra, E. Nagaiah, Shakeel Ahmed
Abstract:
Passive seismic, a wavefield interferometric imaging method, is a low-cost and rapid tool for subsurface investigation used for various geotechnical purposes such as hydrocarbon exploration and seismic microzonation. With recent advancements, its application has been extended to groundwater exploration by means of finding the bedrock depth. The Council of Scientific & Industrial Research (CSIR)-National Geophysical Research Institute (NGRI) has carried out passive seismic studies along with electrical resistivity tomography for groundwater in hard rock (Choutuppal, Hyderabad). Passive seismic combined with Electrical Resistivity Tomography (ERT) can give a clearer 2-D subsurface image for groundwater exploration in hard rock areas. Passive seismic data were collected using a Tromino, a three-component broadband seismometer, to measure background ambient noise, and were processed using GRILLA software. The passive seismic results were found to corroborate the ERT results. For data acquisition, the Tromino was deployed at 30 locations, with a 20-minute recording at each station. These locations show strong resonance frequency peaks, suggesting good impedance contrast between different subsurface layers (e.g., mica-rich laminated layer, weathered layer, granite). This paper presents the passive seismic signature of hard rock terrain. It was found that passive seismic has potential application for formation characterization and can be used as an alternative tool for delineating litho-stratification in urban conditions where electrical and electromagnetic tools cannot be applied due to high cultural noise. In addition, its use in combination with electrical and electromagnetic methods can improve the interpreted subsurface model.
Keywords: passive seismic, resonant frequency, Tromino, GRILLA
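Tromino-style ambient-noise records are commonly reduced to a horizontal-to-vertical spectral ratio (HVSR), whose largest peak marks the site resonance frequency. The abstract does not detail the GRILLA processing chain, so the following is only a generic sketch of the final peak-picking step, with invented spectra:

```python
def hvsr_peak(freqs, h_amp, v_amp):
    """Resonance frequency as the largest peak of the H/V spectral ratio.

    freqs: frequency bins (Hz)
    h_amp: averaged horizontal-component amplitude spectrum
    v_amp: vertical-component amplitude spectrum
    """
    ratios = [h / v for h, v in zip(h_amp, v_amp)]
    i = max(range(len(ratios)), key=ratios.__getitem__)
    return freqs[i], ratios[i]

# Invented spectra: a clear H/V peak near 2 Hz would suggest a strong
# impedance contrast (e.g., weathered layer over granite bedrock).
f0, amplitude = hvsr_peak([0.5, 1.0, 2.0, 4.0, 8.0],
                          [1.0, 1.4, 4.2, 1.1, 0.9],
                          [1.0, 1.1, 1.0, 1.0, 1.0])
```

In practice the real processing also involves windowing, smoothing, and rejecting noisy windows before the ratio is averaged.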
Procedia PDF Downloads 188
4070 Relearning to Learn: Approaching Sustainability by Incorporating Inuit Vernacular and Biomimicry Architecture Principles
Authors: Hakim Herbane
Abstract:
Efforts to achieve sustainability in architecture must prove their effectiveness, despite the variety of methods attempted. Biomimicry, which looks to successful natural models to promote sustainability and innovation, faces obstacles in implementing sustainability despite its restorative approach to the relationship between humans and nature. In Nunavik, Inuit communities are exploring a sustainable production system that aligns with their aspirations and meets their human, technological, technical, economic, and ecological demands. Biomimicry holds promise in line with Inuit philosophy, but its failure so far to deliver sustainability requires further investigation to remedy its deficiencies. Our literature review underscores the importance of involving the community in defining sustainability and determining the best methods for its implementation. Additionally, vernacular architecture offers valuable orientations for achieving sustainability. Moreover, reintegrating Inuit communities and their traditional architectural practices, which have successfully balanced the diverse needs and constraints of their built environment, could pave the way for a sustainable Inuit-built environment in Nunavik while simultaneously advancing the principles of architectural biomimicry. This research aims to establish a sustainability monitoring tool for the Nordic architectural process by analyzing Inuit vernacular and biomimetic architecture, together with the input of stakeholders involved in Inuit architecture production in Nunavik, especially Inuit. The goal is to create a practical tool (an index) to aid in designing sustainable architecture, taking into account environmental, social, and economic perspectives. Furthermore, the study seeks to authenticate strong, sustainable design principles of vernacular and biomimetic architectures. The literature review uncovered challenges and identified new opportunities.
The forthcoming discourse will focus on the careful and considerate incorporation of Inuit communities’ perceptions and indigenous building practices into our methodology, along with the latest findings of our research.
Keywords: sustainability, biomimicry, vernacular architecture, community involvement
Procedia PDF Downloads 54
4069 The Use of a Miniature Bioreactor as Research Tool for Biotechnology Process Development
Authors: Muhammad Zainuddin Arriafdi, Hamudah Hakimah Abdullah, Mohd Helmi Sani, Wan Azlina Ahmad, Muhd Nazrul Hisham Zainal Alam
Abstract:
Biotechnology process development demands numerous experimental works. In a laboratory environment, this is typically carried out using a shake flask platform. This paper presents the design and fabrication of a miniature bioreactor system as an alternative research tool for bioprocessing. The working volume of the reactor is 100 ml, and it is made of plastic. The main features of the reactor include stirring control, temperature control via an electrical heater, an aeration strategy using a miniature air compressor, and online optical cell density (OD) sensing. All sensors and actuators integrated into the reactor were controlled using an Arduino microcontroller platform. To demonstrate the functionality of the miniature bioreactor concept, a series of batch Saccharomyces cerevisiae fermentation experiments was performed under various glucose concentrations. Results from the fermentation experiments were used to solve for the Monod equation constants, namely the saturation constant, Ks, and the cells’ maximum growth rate, μmax, to further highlight the usefulness of the device. The mixing capacity of the reactor was also evaluated. Results attained from the miniature bioreactor prototype were comparable to results achieved using a shake flask. A unique feature of the device compared to the shake flask platform is that the reactor’s mixing condition is much closer to a lab-scale bioreactor setup. The prototype is also integrated with an online OD sensor, so no sampling is needed to monitor the progress of the reaction. Operating cost and medium consumption are also low, making it much more economical for biotechnology process development than lab-scale bioreactors.
Keywords: biotechnology, miniature bioreactor, research tools, Saccharomyces cerevisiae
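The Monod constants mentioned above relate the specific growth rate μ to the substrate concentration S via μ = μmax·S/(Ks + S). One common way to solve for Ks and μmax from batch data at several glucose concentrations is the Lineweaver-Burk linearization, 1/μ = (Ks/μmax)(1/S) + 1/μmax; the paper does not state which fitting method was used, so this is a sketch with synthetic data:

```python
def fit_monod(S, mu):
    """Estimate (mu_max, Ks) from substrate levels S and growth rates mu
    via a least-squares fit of the Lineweaver-Burk line 1/mu vs 1/S."""
    x = [1.0 / s for s in S]
    y = [1.0 / m for m in mu]
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    slope = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
             / sum((xi - mx) ** 2 for xi in x))
    intercept = my - slope * mx
    mu_max = 1.0 / intercept      # intercept = 1/mu_max
    Ks = slope * mu_max           # slope = Ks/mu_max
    return mu_max, Ks

# Synthetic batch data generated from mu_max = 0.5 1/h, Ks = 2 g/L:
S = [1.0, 2.0, 4.0, 8.0]
mu = [0.5 * s / (2.0 + s) for s in S]
mu_max, Ks = fit_monod(S, mu)
```

With real, noisy data a nonlinear fit of the Monod form directly (e.g., scipy.optimize.curve_fit) is usually preferred, since the reciprocal transform amplifies error at low substrate levels.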
Procedia PDF Downloads 117
4068 The Significance of Translating Folklore in Teaching and Learning Open Distance e-Learning
Authors: M. A. Mabasa, O. Ramokolo, M. Z. Mnikathi, D. Mathabatha, T. Manyapelo
Abstract:
The study examines the importance of translating South African folklore from oral into written literature in multilingual education. It postulates that translation can be regarded as a valuable tool when oral and written literature is transmitted from one generation to another. The study holds that translation does not take place in a haphazard fashion; for that reason, skills grounded in translation principles are required to translate folklore significantly and effectively. The purpose of the study is to indicate the significance of using translation relating to folklore in teaching and learning. The study also observes that modernism in literature should be shared among a variety of cultures, because folklore is interactive in narrating stories, folktales, and myths that sharpen the reader's knowledge and intellect, being informative and educative in nature. As a technological tool, translation is of paramount importance in the sense that the meanings of different data can be made available in all South African official languages using oral and written forms of folklore. The study opines that tradition and customary beliefs and practices deserve a place in institutions of higher learning. It envisages the way in which the literature of folklore can be juxtaposed to ensure that translated folklore is of quality-assured standard. Well-translated folklore can serve as oral and written literature, which may contribute to the child's learning and acquisition of knowledge and insights during cognitive development toward maturity. Methodologically, the study adopts a qualitative research approach and uses content analysis as an instrument for data gathering; the data will be analyzed qualitatively in consideration of the significance of translating folklore as written and spoken literature in a documented way.
The study reveals that the translation of folktales promotes functional multilingualism in high-function formal contexts such as a university. It emphasizes that translated and preserved literary folklore may serve as a language repository from one generation to another because of the archiving and storage of information in the form of a term bank.
Keywords: translation, editing, teaching, learning, folklore
Procedia PDF Downloads 33
4067 Indeterminacy: An Urban Design Tool to Measure Resilience to Climate Change, a Caribbean Case Study
Authors: Tapan Kumar Dhar
Abstract:
How well are our city forms designed to adapt to climate change and its resulting uncertainty? What urban design tools can be used to measure and improve resilience to climate change, and how would they do so? In addressing these questions, this paper considers indeterminacy, a concept that originated in the resilience literature, to measure the resilience of built environments. In the realm of urban design, ‘indeterminacy’ refers to the built-in design capability of an urban system to serve different purposes that are not necessarily predetermined. A higher degree of indeterminacy enables the system to be reorganized and changed to accommodate new or unknown functions while coping with uncertainty over time. The underlying principles of this concept have long been discussed in the urban design and planning literature, including open architecture, landscape urbanism, and flexible housing. This paper argues that the concept of indeterminacy holds the potential to reduce the impacts of climate change incrementally and proactively. With regard to sustainable development, both the planning and the climate change literature strongly recommend proactive adaptation, as it involves less cost, effort, and energy than last-minute emergency or reactive actions. Nevertheless, the concept remains isolated from resilience and climate change adaptation discourses, even though those discourses advocate the incremental transformation of a system to cope with climatic uncertainty. This paper uses indeterminacy, as an urban design tool, to measure and increase the resilience (and adaptive capacity) of Long Bay's coastal settlements in Negril, Jamaica. Negril is one of the most popular tourism destinations in the Caribbean and is highly vulnerable to sea-level rise and its associated impacts. This paper employs empirical information obtained from direct observation and informal interviews with local people.
While testing the tool, this paper deploys an urban morphology study, which includes land use patterns and the physical characteristics of urban form, including street networks, block patterns, and building footprints. The results reveal that most resorts in Long Bay are designed for predetermined purposes and offer little potential to be used differently if needed. Additionally, Negril's street networks are found to be rigid and to have limited accessibility to different points of interest. This rigidity can further expose the entire infrastructure to extreme climatic events and also impedes recovery actions after a disaster. However, Long Bay still has room for future resilient development in other, relatively less vulnerable areas. In adapting to climate change, indeterminacy can be reached through design that achieves a balance between the degree of vulnerability and the degree of indeterminacy: the more vulnerable a place is, the more indeterminacy is useful. This paper concludes with a set of urban design typologies to increase the resilience of coastal settlements.
Keywords: climate change adaptation, resilience, sea-level rise, urban form
Procedia PDF Downloads 366
4066 Use of Numerical Tools Dedicated to Fire Safety Engineering for the Rolling Stock
Authors: Guillaume Craveur
Abstract:
This study shows the opportunity to use numerical tools dedicated to fire safety engineering for rolling stock. Indeed, some legal requirements can now be demonstrated using numerical tools. The first part of this study presents the use of an evacuation modelling tool to satisfy the evacuation time criteria for rolling stock. The buildingEXODUS software is used to model and simulate the evacuation of rolling stock. First, in order to demonstrate the reliability of this tool for calculating the complete evacuation time, a comparative study was carried out between a real test and simulations done with buildingEXODUS. Multiple simulations are performed to capture the stochastic variations in egress times. Then, a new study is done to calculate the complete evacuation time of a train with the same geometry but a different interior architecture. The second part of this study shows some applications of Computational Fluid Dynamics. This work presents a multi-scale validation of numerical simulations of standardized tests with the Fire Dynamics Simulator (FDS) software developed by the National Institute of Standards and Technology (NIST). The first step addresses the cone calorimeter test, described in the ISO 5660 standard, used to characterize the fire reaction of materials. The aim of this process is to readjust measurement results from the cone calorimeter test in order to create a data set usable at the seat scale. In the second step, the modelling concerns the fire seat test described in the EN 45545-2 standard. The data set obtained through the validation of the cone calorimeter test was used in the fire seat test. In the third step, after checking the data obtained for the seat from the cone calorimeter test, a larger-scale simulation with a real section of a train is performed.
Keywords: fire safety engineering, numerical tools, rolling stock, multi-scale validation
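The stochastic variation in egress times mentioned above is the reason multiple evacuation runs are simulated. A toy Monte Carlo sketch (all figures invented, not buildingEXODUS output) illustrates why the result is a distribution rather than a single number:

```python
# Toy Monte Carlo sketch of stochastic egress times: walking speeds
# vary between runs, so total evacuation time is a distribution.
# The model and all numbers are invented for illustration only.

import random

def evacuation_time(n_passengers, exit_flow_rate, rng):
    """Crude model: queueing time at the door plus the slowest walk time."""
    walk_times = [rng.uniform(5.0, 15.0) for _ in range(n_passengers)]
    queue_time = n_passengers / exit_flow_rate  # seconds to clear the door
    return queue_time + max(walk_times)

rng = random.Random(42)
times = [evacuation_time(80, 1.2, rng) for _ in range(500)]
mean_time = sum(times) / len(times)
spread = max(times) - min(times)
```

Real egress models track per-agent movement through the car geometry; the point here is only that repeated runs yield a spread of times, which is why the study reports multiple simulations rather than a single value.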
Procedia PDF Downloads 303
4065 Optimized Passive Heating for Multifamily Dwellings
Authors: Joseph Bostick
Abstract:
This study presents a method of decreasing the heating load of HVAC systems in a single-dwelling model of a multifamily building by controlling movable insulation through the optimization of flux, time, surface incident solar radiation, and temperature thresholds. Simulations are completed using a co-simulation between EnergyPlus and MATLAB, with MATLAB acting as the optimization tool to find optimal control thresholds. Optimizing the control thresholds leads to a significant decrease in total heating energy expenditure.
Keywords: EnergyPlus, MATLAB, simulation, energy efficiency
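The control idea can be sketched as a simple rule: retract the movable insulation when solar gain is useful, deploy it when it reduces heat loss. The thresholds and the rule below are invented placeholders for the optimized values the study searches for:

```python
# Hypothetical sketch of threshold-based movable-insulation control.
# Threshold values and the comfort rule are invented placeholders,
# not the optimized values found by the EnergyPlus/MATLAB study.

def insulation_deployed(solar_flux, t_out, t_in,
                        flux_threshold=150.0, t_setpoint=20.0):
    """Return True when the movable insulation should cover the window."""
    if solar_flux > flux_threshold and t_in < t_setpoint:
        return False  # retract: admit solar gain to heat the room
    if t_out < t_in:
        return True   # deploy: insulate against heat loss outdoors
    return False

# Night-time, cold outside: insulate
assert insulation_deployed(solar_flux=0.0, t_out=-5.0, t_in=19.0)
```

In the study the thresholds themselves are the decision variables, tuned by the MATLAB optimizer against the EnergyPlus heating-energy output.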
Procedia PDF Downloads 174
4064 Response of Lepidium Sativum to Ionic Toxicity
Authors: M. F. El-Barghathi, R. El-Tajouri
Abstract:
The effect of different concentrations of cadmium sulfate (CdSO4: 0.0, 10, 50, 100, 500 ppm) was tested on seed germination, seedling elongation, and growth of Lepidium sativum (garden cress) plants. Results indicated that seed germination and seedling elongation were not inhibited by the different concentrations of CdSO4. This could suggest that Lepidium sativum may be used as a phytoremediation tool for soils contaminated with cadmium.
Keywords: Lepidium sativum, heavy metals, ionic toxicity, phytoremediation
Procedia PDF Downloads 556
4063 Taguchi-Based Surface Roughness Optimization for Slotted and Tapered Cylindrical Products in Milling and Turning Operations
Authors: Vineeth G. Kuriakose, Joseph C. Chen, Ye Li
Abstract:
The research follows a systematic approach to optimize the parameters for parts machined by turning and milling processes. The quality characteristic chosen is surface roughness, since surface finish plays an important role for parts that require surface contact. A tapered cylindrical surface is designed as the test specimen for the research. The material chosen for machining is aluminum alloy 6061 due to its wide variety of industrial and engineering applications. A HAAS VF-2 TR computer numerical control (CNC) vertical machining center is used for milling, and a HAAS ST-20 CNC machine is used for turning. Taguchi analysis is used to optimize the surface roughness of the machined parts. An L9 orthogonal array is designed for four controllable factors with three levels each, giving nine runs per process and 18 experimental runs in total. The signal-to-noise (S/N) ratio is calculated for achieving the specific target value of 75 ± 15 µin. The controllable parameters chosen for the turning process are feed rate, depth of cut, coolant flow, and finish cut; those for the milling process are feed rate, spindle speed, step over, and coolant flow. The uncontrollable factors are tool geometry for the turning process and tool material for the milling process. Hypothesis testing is conducted to study the significance of the uncontrollable factors on surface roughness. The optimal parameter settings were identified from the Taguchi analysis, and the process capability Cp and the process capability index Cpk were improved from 1.76 and 0.02 to 3.70 and 2.10, respectively, for the turning process, and from 0.87 and 0.19 to 3.85 and 2.70, respectively, for the milling process. The surface roughness was improved from 60.17 µin to 68.50 µin, reducing the defect rate from 52.39% to 0%, for the turning process, and from 93.18 µin to 79.49 µin, reducing the defect rate from 71.23% to 0%, for the milling process.
The purpose of this study is to show how Taguchi design analysis can be utilized efficiently to improve surface roughness.
Keywords: surface roughness, Taguchi parameter design, CNC turning, CNC milling
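For a nominal-the-best characteristic such as a roughness target of 75 ± 15 µin, the S/N ratio and the capability indices quoted above have standard textbook forms: S/N = 10·log10(ȳ²/s²), Cp = (USL − LSL)/6σ, and Cpk = min(USL − ȳ, ȳ − LSL)/3σ. A sketch with invented readings (not the study's data):

```python
# Standard nominal-the-best S/N ratio and Cp/Cpk computation.
# The roughness readings below are invented for illustration.
import math

def sn_nominal_the_best(values):
    """Taguchi nominal-the-best S/N ratio: 10*log10(ybar^2 / s^2)."""
    n = len(values)
    ybar = sum(values) / n
    s2 = sum((v - ybar) ** 2 for v in values) / (n - 1)
    return 10 * math.log10(ybar ** 2 / s2)

def cp_cpk(values, lsl, usl):
    """Process capability Cp and capability index Cpk."""
    n = len(values)
    mean = sum(values) / n
    sd = (sum((v - mean) ** 2 for v in values) / (n - 1)) ** 0.5
    cp = (usl - lsl) / (6 * sd)
    cpk = min(usl - mean, mean - lsl) / (3 * sd)
    return cp, cpk

# Invented roughness readings (micro-inches) against the 75 +/- 15 spec
readings = [74.0, 76.5, 73.8, 75.9, 74.6]
sn = sn_nominal_the_best(readings)
cp, cpk = cp_cpk(readings, lsl=60.0, usl=90.0)
```

A higher S/N ratio means less variation about the mean; Cpk < Cp signals that the process mean is off-center within the specification band.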
Procedia PDF Downloads 155
4062 Cost Based Analysis of Risk Stratification Tool for Prediction and Management of High Risk Choledocholithiasis Patients
Authors: Shreya Saxena
Abstract:
Background: Choledocholithiasis is a common complication of gallstone disease. Risk scoring systems exist to guide the need for further imaging or endoscopy in managing choledocholithiasis. We completed an audit reviewing the American Society for Gastrointestinal Endoscopy (ASGE) scoring system for the prediction and management of choledocholithiasis against current practice at a tertiary hospital, to assess its utility in resource optimisation. We have now conducted a cost-focused sub-analysis on patients categorized as high-risk for choledocholithiasis according to the guidelines, to determine any associated cost benefits. Method: Data from our prior audit were used to retrospectively identify thirteen patients considered high-risk for choledocholithiasis. Their ongoing management was mapped against the guidelines. Individual costs for the key investigations were obtained from our hospital financial data. Total costs for the different management pathways identified in clinical practice were calculated and compared against predicted costs associated with the recommendations in the guidelines. We excluded the cost of laparoscopic cholecystectomy and used a set figure for per-day hospital admission expenses. Results: Based on our previous audit data, we identified a 77% positive predictive value for the ASGE risk stratification tool in determining patients at high risk of choledocholithiasis. 46% (6/13) had a magnetic resonance cholangiopancreatography (MRCP) prior to endoscopic retrograde cholangiopancreatography (ERCP), whilst 54% (7/13) went straight for ERCP. The average length of stay in the hospital was 7 days, with an additional day and cost of £328.00 (£117 for ERCP) for patients awaiting an MRCP prior to ERCP. A per-day hospital admission was valued at £838.69. When calculating total cost, we assumed all patients had admission bloods and an ultrasound done as the gold standard.
Doing an MRCP prior to ERCP incurred a 130% increase in cost (£580.04 vs £252.04) per patient. When also considering hospital admission and the average length of stay, this amounted to an additional £1,166.69 per patient. We then calculated the exact costs incurred by the department, over a three-month period, for all patients, for the key investigations and procedures done in the management of choledocholithiasis. This was compared to an estimated cost derived from the recommended pathways in the ASGE guidelines. Overall, an 81% saving (£2,048.45) was associated with following the guidelines compared to clinical practice. Conclusion: MRCP is the most expensive test associated with the diagnosis and management of choledocholithiasis. The ASGE guidelines recommend endoscopy without an MRCP in patients stratified as high-risk for choledocholithiasis. Our audit, which focused on assessing the utility of the ASGE risk scoring system, showed it to be relatively reliable for identifying high-risk patients. Our cost analysis has shown significant cost savings per patient, and also in the average length of stay, associated with direct endoscopy rather than an additional MRCP; part of this is because of the increased average length of stay associated with waiting for an MRCP. These data support the ASGE guidelines for the management of patients at high risk of choledocholithiasis from a cost perspective. The only caveat is our small data set, which may affect the validity of our average length of hospital stay figures and hence the total cost calculations.
Keywords: cost-analysis, choledocholithiasis, risk stratification tool, general surgery
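The per-patient figures quoted above are internally consistent, as a quick arithmetic check shows:

```python
# Arithmetic check of the per-patient cost figures quoted in the
# abstract: direct-ERCP pathway vs MRCP-before-ERCP pathway.

ercp_only = 252.04          # GBP, direct-to-ERCP investigation cost
mrcp_then_ercp = 580.04     # GBP, with an MRCP added before ERCP
extra_bed_day = 838.69      # GBP, one additional day of admission

investigation_increase = mrcp_then_ercp - ercp_only        # 328.00
percent_increase = 100 * investigation_increase / ercp_only
total_extra = investigation_increase + extra_bed_day       # 1166.69
```

The £328.00 investigation difference plus the £838.69 extra bed-day reproduces the £1,166.69 additional cost per patient, and the ratio reproduces the quoted 130% increase.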
Procedia PDF Downloads 98
4061 Development of a Computer Aided Diagnosis Tool for Brain Tumor Extraction and Classification
Authors: Fathi Kallel, Abdulelah Alabd Uljabbar, Abdulrahman Aldukhail, Abdulaziz Alomran
Abstract:
The brain is an important organ in our body, since it is responsible for most actions such as vision, memory, etc. However, different diseases, such as Alzheimer's disease and tumors, can affect the brain and lead to partial or full disorder. Regular diagnosis is necessary as a preventive measure and can help doctors detect a possible problem early and take the appropriate treatment, especially in the case of brain tumors. Different imaging modalities have been proposed for the diagnosis of brain tumors; the most powerful and most used modality is Magnetic Resonance Imaging (MRI). MRI images are analyzed by a doctor in order to locate any tumor in the brain and prescribe the appropriate treatment. Diverse image processing methods have also been proposed to help doctors identify and analyze the tumor. In fact, a wide range of Computer Aided Diagnosis (CAD) tools built on image processing algorithms are exploited by doctors as a second opinion to analyze and identify brain tumors. In this paper, we propose a new advanced CAD for brain tumor identification, classification, and feature extraction. Our proposed CAD includes three main parts. First, we load the brain MRI. Second, a robust technique for brain tumor extraction is proposed, based on both the Discrete Wavelet Transform (DWT) and Principal Component Analysis (PCA). DWT is characterized by its multiresolution analysis property, which is why it was applied to MRI images with different decomposition levels for feature extraction. Nevertheless, this technique suffers from a main drawback: it requires huge storage and is computationally expensive. To decrease the dimensions of the feature vector and the computing time, the PCA technique is applied. In the last stage, according to the extracted features, the brain tumor is classified as either benign or malignant using the Support Vector Machine (SVM) algorithm.
A CAD tool for brain tumor detection and classification, including all the above-mentioned stages, is designed and developed using the MATLAB GUIDE user interface.
Keywords: MRI, brain tumor, CAD, feature extraction, DWT, PCA, classification, SVM
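To illustrate the DWT stage, a one-level Haar transform (shown in 1-D for brevity; the CAD operates on 2-D MRI slices with multiple decomposition levels) splits a signal into approximation and detail coefficients while preserving its energy. The signal values are invented:

```python
# One-level Haar discrete wavelet transform on a 1-D signal of even
# length: pairwise sums give approximation coefficients, pairwise
# differences give detail coefficients. Signal values are invented.
import math

def haar_dwt_1d(signal):
    """One decomposition level: (approximation, detail) coefficients."""
    a = [(signal[i] + signal[i + 1]) / math.sqrt(2)
         for i in range(0, len(signal), 2)]
    d = [(signal[i] - signal[i + 1]) / math.sqrt(2)
         for i in range(0, len(signal), 2)]
    return a, d

x = [4.0, 6.0, 10.0, 12.0, 8.0, 6.0, 5.0, 5.0]
approx, detail = haar_dwt_1d(x)
```

Because the Haar basis is orthonormal, signal energy is preserved across the transform; in the CAD pipeline the resulting coefficient vectors are what PCA then compresses before SVM classification.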
Procedia PDF Downloads 250
4060 Using Derivative Free Method to Improve the Error Estimation of Numerical Quadrature
Authors: Chin-Yun Chen
Abstract:
Numerical integration is an essential tool for deriving different physical quantities in engineering and science. The effectiveness of a numerical integrator depends on several factors, of which the crucial one is error estimation. This work presents an error estimator that incorporates a derivative-free method to improve the performance of verified numerical quadrature.
Keywords: numerical quadrature, error estimation, derivative free method, interval computation
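As a generic illustration of derivative-free error estimation (not the verified interval method of this work), a step-halving trapezoidal rule estimates its own error from two grid resolutions, with no derivative bounds required:

```python
# Generic illustration only: a step-halving trapezoidal rule whose
# Richardson error estimate (T_half - T) / 3 needs no derivative
# bounds, in the spirit of derivative-free error estimation.

def trapezoid(f, a, b, n):
    """Composite trapezoidal rule with n subintervals."""
    h = (b - a) / n
    total = 0.5 * (f(a) + f(b)) + sum(f(a + i * h) for i in range(1, n))
    return h * total

def integrate_with_estimate(f, a, b, n=64):
    coarse = trapezoid(f, a, b, n)
    fine = trapezoid(f, a, b, 2 * n)
    error_estimate = (fine - coarse) / 3.0  # Richardson, O(h^2) rule
    return fine + error_estimate, abs(error_estimate)

value, err = integrate_with_estimate(lambda x: x * x, 0.0, 1.0)
```

Adding the error estimate back yields the Simpson-equivalent extrapolated value; for ∫₀¹ x² dx it recovers 1/3 essentially exactly, while the error estimate itself bounds the trapezoidal error without any use of f''.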
Procedia PDF Downloads 463
4059 Reliability and Maintainability Optimization for Aircraft’s Repairable Components Based on Cost Modeling Approach
Authors: Adel A. Ghobbar
Abstract:
The airline industry continuously faces the challenge of safely increasing the service life of aircraft with limited maintenance budgets. Operators are looking for the most qualified maintenance providers of aircraft components, offering the finest customer service. The component owner and maintenance provider offer an Abacus agreement (aircraft component leasing) to increase the efficiency and productivity of the customer service. To increase customer service, the current focus on No Fault Found (NFF) units must shift to a focus on Early Failure (EF) units. Since EF units have a significant impact on customer satisfaction, their reliability needs to be increased at minimal cost, which is the goal of this paper. By identifying the reliability of early failure (EF) units relative to No Fault Found (NFF) units, and in particular by combining root cause analysis with an integrated cost analysis of EF units using a failure mode analysis tool and a cost model, a set of EF maintenance improvements is derived. The data used for the investigation of the EF units were obtained from the Pentagon system, an Enterprise Resource Planning (ERP) system used by Fokker Services. The Pentagon system monitors components that need to be repaired, from Fokker aircraft owners, the Abacus exchange pool, and commercial customers. The data were selected on several criteria: time span, failure rate, and cost driver. Once the selected data had been acquired, the failure mode and root cause analysis of the EF units was initiated. The failure analysis approach tool was implemented, resulting in proposed solutions for the EFs. This leads to specific EF maintenance improvements, which can be set up to decrease the number of EF units and, as a result, increase reliability. The investigated EFs, over a time period of ten years, showed a significant reliability impact of 32% on a total of 23,339 unscheduled failures, since the EFs comprise almost one-third of the entire population.
Keywords: supportability, no fault found, FMEA, early failure, availability, operational reliability, predictive model
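The headline figure can be checked directly: a 32% reliability impact on 23,339 unscheduled failures corresponds to roughly one-third of the population:

```python
# Quick arithmetic check of the abstract's headline figure:
# EF units account for roughly one-third of all unscheduled failures.

total_unscheduled = 23339
ef_share = 0.32
ef_failures = round(total_unscheduled * ef_share)  # ~7468 events
```
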
Procedia PDF Downloads 127
4058 Empowering Indigenous Epistemologies in Geothermal Development
Authors: Te Kīpa Kēpa B. Morgan, Oliver W. Mcmillan, Dylan N. Taute, Tumanako N. Fa'aui
Abstract:
Epistemologies are ways of knowing. Indigenous Peoples are aware that they do not perceive and experience the world in the same way as others. So it is important, when empowering Indigenous epistemologies such as that of the New Zealand Māori, to also be able to represent a scientific understanding within the same analysis. A geothermal development assessment tool has been developed by adapting the Mauri Model Decision Making Framework. Mauri is a metric capable of representing the change in the life-supporting capacity of things and collections of things. The Mauri Model is a method of grouping mauri indicators as dimension averages in order to allow holistic assessment and to conduct sensitivity analyses for the effect of worldview bias. R Shiny is the coding platform used for this Vision Mātauranga research, which has created an expert decision support tool (DST) that combines a stakeholder assessment of worldview bias with an impact assessment of mauri-based indicators to determine the sustainability of proposed geothermal development. The initial intention was to develop guidelines for quantifying mātauranga Māori impacts related to geothermal resources. To do this, three typical scenarios were considered: a resource owner wishing to assess the potential for new geothermal development; another party wishing to assess the environmental and cultural impacts of the proposed development; and an assessment that focuses on the holistic sustainability of the resource, including its surface features. Indicator sets and measurement thresholds were developed for each assessment context and grouped to represent four mauri dimensions that mirror the four well-being criteria used for resource management in Aotearoa, New Zealand. Two case studies have been conducted to test the DST's suitability for quantifying mātauranga Māori and other biophysical factors related to a geothermal system.
This involved estimating mauri0meter values for physical features such as temperature, flow rate, frequency, and colour, and developing indicators to also quantify qualitative observations about the geothermal system made by Māori. A retrospective analysis was then conducted to verify different understandings of the geothermal system. The case studies found that the expert DST is useful for geothermal development assessment, especially where hapū (indigenous sub-tribal groupings) are conflicted regarding the benefits and disadvantages of their own and others’ geothermal developments. These results have been supplemented with evaluations of the cumulative impacts of geothermal developments experienced by different parties, using integration techniques applied to the time-history curve of the worldview-bias-weighted expert DST plotted against the mauri0meter score. Cumulative impacts represent the change in resilience or potential of geothermal systems, which directly assists with the holistic interpretation of change from an Indigenous Peoples’ perspective.
Keywords: decision support tool, holistic geothermal assessment, indigenous knowledge, mauri model decision-making framework
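The scoring idea behind the Mauri Model can be sketched as a weighted average of dimension averages, with stakeholder worldview-bias weightings; the dimension names below are assumed to mirror the four New Zealand well-being criteria, and all numbers are invented placeholders:

```python
# Hedged sketch of Mauri Model scoring: indicators are rated on a
# five-point mauri scale (-2 fully degraded .. +2 fully enhanced),
# averaged per dimension, then combined with worldview-bias weights.
# Dimension names are assumed; all numbers are invented.

def mauri_score(dimension_scores, weights):
    """Weighted average of the four dimension averages."""
    assert abs(sum(weights.values()) - 1.0) < 1e-9  # weights sum to 1
    return sum(dimension_scores[d] * weights[d] for d in dimension_scores)

scores = {"environmental": -1.0, "cultural": 1.0,
          "social": 0.5, "economic": 2.0}
# A stakeholder weighting that privileges economic well-being:
weights = {"environmental": 0.1, "cultural": 0.2,
           "social": 0.2, "economic": 0.5}
overall = mauri_score(scores, weights)
```

Sensitivity analysis for worldview bias then amounts to recomputing the overall score under different stakeholders' weightings and comparing the results.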
Procedia PDF Downloads 187
4057 Positive Impact of Cartoon Movies on Adults
Authors: Yacoub Aljaffery
Abstract:
As much as we think negatively about media such as TV and smartphones, there are many positive benefits our society can get from them. Cartoons, for example, are made specifically for children. In this paper, however, we show how cartoon videos can have a positive impact on adults, especially college students. Since cartoons can be a good learning tool for children as well as adults, we show our audience how cartoons can be used to teach critical thinking and other language skills.
Keywords: social media, TV, teaching, learning, cartoon movies
Procedia PDF Downloads 324
4056 The Reasons behind Individuals to Join Terrorist Organizations: Recruitment from Outside
Authors: Murat Sözen
Abstract:
Today, terrorism is gaining momentum again. In parallel, it hurts more than before because it claims victims not only in its own locations but also in remote places. Just as victims come from outside, militants likewise come from both the organization's own location and from outside it. What makes these individuals join terrorist organizations, and how these organizations recruit militants, are still unanswered questions. The purpose of this work is to identify the reasons for joining and the power of recruiting. In addition, the role of the most popular recruiting tool, social media, will be examined.
Keywords: recruitment, social media, militants
Procedia PDF Downloads 349
4055 International E-Learning for Assuring Ergonomic Working Conditions of Orthopaedic Surgeons: First Research Outcomes from Train4OrthoMIS
Authors: J. Bartnicka, J. A. Piedrabuena, R. Portilla, L. Moyano - Cuevas, J. B. Pagador, P. Augat, J. Tokarczyk, F. M. Sánchez Margallo
Abstract:
Orthopaedic surgeries are characterized by a high degree of complexity. This is reflected in four main groups of resources: 1) the surgical team, which consists of people with different competencies, educational backgrounds, and positions; 2) information and knowledge about the medical and technical aspects of the surgery; 3) medical equipment, including surgical tools and materials; and 4) the space infrastructure, which is important from an operating room layout point of view. All these components must be integrated into a homogeneous whole to achieve an efficient and ergonomically correct surgical workflow. Against this background, an international project was formulated, called “Online Vocational Training course on ergonomics for orthopaedic Minimally Invasive” (Train4OrthoMIS), whose aim is to develop an e-learning tool available in four languages (English, Spanish, Polish, and German). This article presents the first project research outcomes, focused on three aspects: 1) the ergonomic needs of surgeons who work in hospitals across different European countries; 2) the concept and structure of the e-learning course; 3) the definition of tools and methods for knowledge assessment adjusted to users’ expectations. The methodology was based on expert panels and two types of surveys: 1) on training needs, and 2) on evaluation and self-assessment preferences. The major findings of the study made it possible to describe the subjects of the four training modules and learning sessions. According to respondents’ opinions, the most expected test methods are single-choice tests, followed by quizzes such as “True or False” and “Link elements”. The first project outcomes confirmed the necessity of creating a universal training tool for orthopaedic surgeons regardless of the country in which they work.
Because of the limited time surgeons have, the e-learning course should be strictly adjusted to their expectations in order to be useful.
Keywords: international e-learning, ergonomics, orthopaedic surgery, Train4OrthoMIS
Procedia PDF Downloads 180
4054 Measuring Fundamental Growth Needs in a Youth Boatbuilding Context
Authors: Shane Theunissen, Rob Grandy
Abstract:
Historically, and fairly conventionally within our formal schooling systems, we have relied on convergent testing, where all students are expected to converge on the same answer, an answer determined by an external authority that reproduces the knowledge of the hegemon. Many youths may not embody the cultural capital that is rewarded in formal schooling contexts, as they are not able to converge on the required answer determined by the classroom teacher or the administrators. In this paper, we explore divergent processes that promote creative problem-solving. We embody this divergent process in our measurement of fundamental growth needs. To this end, we utilize the Mosaic Approach as a method for implementing the Outcomes That Matter framework. Outcomes That Matter is the name of a measurement tool built around the Circle of Courage framework, which is a way of identifying fundamental growth needs for young people. The Circle of Courage was developed by Martin Brokenleg and colleagues as a way to connect indigenous child-rearing philosophies with contemporary resilience and positive psychology research. The Outcomes That Matter framework puts forward four categories of growth needs for young people: Belonging, which on a macro scale is acceptance into the greater community of practice; Mastery, which includes a constellation of concepts including confidence, motivation, self-actualization, and self-determination; Independence, which refers to a sense of personal power and autonomy within a context where creativity, problem solving, and a personal voice can begin to emerge; and finally Generosity, which includes interpersonal skills like conflict resolution and teamwork. Outcomes That Matter puts these four domains into a measurement tool that facilitates collaborative assessment among youth, teachers, and recreation therapists and allows for youth-led narratives pertaining to their fundamental growth outcomes.
This application of the Outcomes That Matter framework is unique, as it may be the first application of the framework in an educational boatbuilding context.
Keywords: collaboration, empowerment, outcomes that matter, mosaic approach, boat building
Procedia PDF Downloads 97