Search results for: collaborative learning approach
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 18813

11673 The Quality of Food and Drink Product Labels Translation from Indonesian into English

Authors: Rudi Hartono, Bambang Purwanto

Abstract:

The translation quality of food and drink labels from Indonesian into English is poor: the translations are inaccurate, unnatural, and difficult to read. Such label translations can be found on the packaging of canned food and drink products produced and marketed by several companies in Indonesia. If this problem is left unchecked, it will lead to misunderstandings of the translated text and confuse consumers. This study was conducted to analyze the translation errors on food and drink product labels and to formulate solutions for better translation quality. The research design was evaluation research with a holistic criticism approach. The data were words, phrases, and sentences translated from Indonesian into English and printed on food and drink product labels. The data were processed using Interactive Model Analysis, which involves three main steps: collecting, classifying, and verifying data. The data were then subjected to content analysis to assess the accuracy, naturalness, and readability of the translations. The results showed that the translation quality of food and drink product labels from Indonesian into English reached 60% for accuracy, 50% for naturalness, and 60% for readability. These findings call for an effective strategy for translating food and drink product labels in the future.

Keywords: translation quality, food and drink product labels, holistic criticism approach, interactive model, content analysis

Procedia PDF Downloads 348
11672 Rapid Building Detection in Population-Dense Regions with Overfitted Machine Learning Models

Authors: V. Mantey, N. Findlay, I. Maddox

Abstract:

The quality and quantity of global satellite data have been increasing exponentially in recent years as spaceborne systems become more affordable and the sensors themselves become more sophisticated. This is a valuable resource for many applications, including disaster management and relief. However, while more information can be valuable, the volume of data available is impossible to examine manually. The question therefore becomes how to extract as much information as possible from the data with limited manpower. Buildings are a key feature of interest in satellite imagery, with applications including telecommunications, population models, and disaster relief. Machine learning tools are fast becoming one of the key resources for solving this problem, and models have been developed to detect buildings in optical satellite imagery. By and large, however, most models focus on affluent regions where buildings are generally larger and constructed further apart. This work is focused on the more difficult problem of detection in densely populated regions. The primary challenge in detecting small buildings in such regions lies in both the spatial and the spectral resolution of the optical sensor. Densely packed buildings with similar construction materials are difficult to separate because of their similarity in color and because the physical separation between structures is either non-existent or smaller than the spatial resolution. This study finds that models trained until they overfit the input sample can perform better in these areas than a more robust, generalized model. An overfitted model takes less time to fine-tune from a generalized pre-trained model and requires less input data. The model developed for this study has also been fine-tuned using existing, open-source building vector datasets. This is particularly valuable in the context of disaster relief, where information is required in a very short time span. 
Leveraging existing datasets means that little to no manpower or time is required to collect data in the region of interest. The training period itself is also shorter for smaller datasets. Requiring less data means that only a few quality areas are necessary, and so any weaknesses or underpopulated regions in the data can be skipped over in favor of areas with higher quality vectors. In this study, a landcover classification model was developed in conjunction with the building detection tool to provide a secondary source to quality check the detected buildings. This has greatly reduced the false positive rate. The proposed methodologies have been implemented and integrated into a configurable production environment and have been employed for a number of large-scale commercial projects, including continent-wide DEM production, where the extracted building footprints are being used to enhance digital elevation models. Overfitted machine learning models are often considered too specific to have any predictive capacity. However, this study demonstrates that, in cases where input data is scarce, overfitted models can be judiciously applied to solve time-sensitive problems.
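
The abstract's central claim (a model deliberately overfitted to a small region-specific sample can beat a broadly trained, generalized model inside that region) can be illustrated with a toy example. This is not the authors' Mask R-CNN pipeline; the data generator, model choice, and parameters below are invented purely to demonstrate the effect:

```python
# Toy illustration (not the authors' Mask R-CNN pipeline): a model fit
# tightly to a small region-specific sample can beat a broadly trained
# "generalized" model when evaluated inside that region.
import numpy as np
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)

def make_region(n, shift):
    # two classes separated by a line whose position depends on `shift`,
    # standing in for region-specific imagery statistics
    X = rng.normal(size=(n, 2))
    y = (X[:, 0] + X[:, 1] > shift).astype(int)
    return X, y

# "Global" training data pools several regions with different boundaries.
regions = [make_region(200, s) for s in (-2.0, 0.0, 2.0)]
X_glob = np.vstack([X for X, _ in regions])
y_glob = np.concatenate([y for _, y in regions])

# A small sample from the region of interest (boundary shift = 2.0) ...
X_roi, y_roi = make_region(100, 2.0)
# ... and a held-out test set from the same region.
X_test, y_test = make_region(500, 2.0)

# generalized model: trained broadly, capacity-limited
general = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X_glob, y_glob)
# "overfitted" model: unrestricted depth, fit only to the small local sample
overfit = DecisionTreeClassifier(random_state=0).fit(X_roi, y_roi)

acc_gen = general.score(X_test, y_test)
acc_over = overfit.score(X_test, y_test)
print(f"generalized: {acc_gen:.2f}, overfitted-to-region: {acc_over:.2f}")
```

The generalized model is forced to compromise between regions whose statistics conflict, while the locally overfitted model matches the target region, which is the trade-off the study exploits.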

Keywords: building detection, disaster relief, Mask R-CNN, satellite mapping

Procedia PDF Downloads 159
11671 A Design Decision Framework for Net-Zero Carbon Buildings in Hot Climates: A Modeled Approach and Experts' Feedback

Authors: Eric Ohene, Albert P. C. Chan, Shu-Chien HSU

Abstract:

Rising building energy consumption and the related carbon emissions make it necessary to construct net-zero carbon buildings (NZCBs). The objective of net-zero buildings has raised the benchmark for building performance and will alter how buildings are designed and constructed. However, there have been growing concerns about uncertainty in net-zero building design and its cost implications for decision-making. Lessons from practice have shown that a robust net-zero building design is complex, expensive, and time-consuming. Moreover, climate conditions have enormous implications for choosing the best passive and active solutions to ensure building energy performance while maintaining the indoor comfort of occupants. It is observed that 20% of the design decisions made in the initial design phase influence 80% of all design decisions. To design and construct NZCBs, it is therefore crucial to ensure adequate decision-making during the early design phases. This study aims to explore practical strategies for designing NZCBs and to offer a design framework that could support decision-making during the design stage of net-zero buildings. A parametric simulation approach was employed, and experts' (i.e., architects and building designers) perspectives on the decision framework were solicited. The study could help building designers and architects guide their decision-making during the design stage of NZCBs.
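
The parametric-simulation idea can be sketched as a sweep over early-design parameters ranked by a performance estimate. The surrogate model, parameter names, and coefficients below are invented for illustration; a real study would drive a simulation engine such as EnergyPlus rather than a hand-written formula:

```python
# Toy parametric sweep in the spirit of the abstract: enumerate combinations
# of two early-design parameters and rank them with a crude surrogate for
# hot-climate cooling energy. Coefficients are illustrative only.
from itertools import product

def annual_cooling_kwh_m2(wwr, shgc):
    # more glazing area and higher solar gain -> more cooling energy
    # (invented surrogate, not a validated model)
    return 40 + 120 * wwr * shgc

designs = list(product([0.2, 0.4, 0.6],      # window-to-wall ratio (WWR)
                       [0.25, 0.4, 0.6]))    # solar heat gain coeff. (SHGC)
ranked = sorted(designs, key=lambda d: annual_cooling_kwh_m2(*d))
best_wwr, best_shgc = ranked[0]
print(f"best early-design choice: WWR={best_wwr}, SHGC={best_shgc}")
```

Even this trivial sweep shows why early decisions matter: the ranking over the whole design space is available before any detailed design work is committed.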

Keywords: net-zero, net-zero carbon building, energy efficiency, parametric simulation, hot climate

Procedia PDF Downloads 90
11670 Flood Planning Based on Risk Optimization: A Case Study in Phan-Calo River Basin in Vinh Phuc Province, Vietnam

Authors: Nguyen Quang Kim, Nguyen Thu Hien, Nguyen Thien Dung

Abstract:

Flood disasters are increasing worldwide in both frequency and magnitude. Every year in Vietnam, floods cause great damage to people and property and degrade the environment. Flood risk management policy in Vietnam is currently being updated, and the planning of flood mitigation strategies is under review to decide how to reach sustainable flood risk reduction. This paper discusses a basic approach in which flood protection measures are chosen by minimizing the present value of expected monetary expenses, i.e., the total residual risk plus the costs of flood control measures. This approach is proposed and demonstrated in a case study of flood risk management in Vinh Phuc province, Vietnam. The research also proposes a framework for finding the optimal protection level and the optimal flood control measures. It provides an explicit economic basis for flood risk management plans and captures the interactive effects of options for flood damage reduction. The results of the case study are demonstrated and discussed, providing a course of action that helps decision makers choose among flood risk reduction investment options.
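
The optimization the abstract describes reduces to comparing, for each candidate protection level, the capital cost plus the present value of expected residual damage. The sketch below uses entirely hypothetical numbers, not the Phan-Calo case study data:

```python
# Illustrative sketch of risk-based flood planning (hypothetical numbers,
# not the Phan-Calo case study data): choose the protection level whose
# capital cost plus present value of expected residual flood damage is lowest.

def present_value(annual_amount, rate, years):
    # present value of a constant annual expense over the planning horizon
    return sum(annual_amount / (1 + rate) ** t for t in range(1, years + 1))

# candidate design return periods -> (capital cost, annual exceedance probability)
options = {
    10:  (1.0e6, 1 / 10),
    50:  (2.5e6, 1 / 50),
    100: (4.0e6, 1 / 100),
}
damage_if_flooded = 30.0e6     # expected damage per flood event (illustrative)
rate, horizon = 0.05, 50       # discount rate and planning horizon (years)

def total_cost(capital, p_exceed):
    expected_annual_damage = p_exceed * damage_if_flooded  # residual risk
    return capital + present_value(expected_annual_damage, rate, horizon)

best = min(options, key=lambda T: total_cost(*options[T]))
print("optimal design return period:", best, "years")
```

With these numbers the highest protection level wins because the discounted residual damage it avoids exceeds its extra capital cost; with cheaper damage or dearer dikes the optimum shifts, which is exactly the trade-off the framework is meant to expose.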

Keywords: drainage plan, flood planning, flood risk, residual risk, risk optimization

Procedia PDF Downloads 215
11669 A Node Model Approach for TCnNet: Trellis Coded Nanonetworks on a Graphene Composite Substrate

Authors: Diogo Ferreira Lima Filho, José Roberto Amazonas

Abstract:

Nanotechnology opens the door to new paradigms and introduces a variety of novel tools, enabling a plethora of potential applications in the biomedical, industrial, environmental, and military fields. This work proposes an integrated node model that applies the concepts of TCNet to networks of nanodevices, in which the nodes are cooperatively interconnected in a low-complexity Mealy machine (MM) topology. The model integrates into a single electronic system the modules necessary for independent operation in wireless sensor networks (WSNs): rectennas (RF-to-DC power converters), code generators based on a finite state machine (FSM), a trellis decoder, and on-chip transmit/receive circuitry that is autonomous in terms of energy sources thanks to energy harvesting. The approach assumes a graphene composite substrate (GCS) for the integrated electronic circuits, which offers mechanical flexibility, miniaturization, and optical transparency, besides being ecological. Graphene consists of a layer of carbon atoms arranged in a honeycomb crystal lattice and has attracted the attention of the scientific community due to its unique electrical characteristics.
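
The combination of a finite state machine and trellis coding that the node model relies on can be made concrete with a textbook example: a rate-1/2 convolutional encoder viewed as a Mealy machine, whose output depends on both the current state and the current input. The (7,5) octal generators used below are the standard classroom choice, not necessarily the encoder specified for TCnNet:

```python
# Minimal Mealy-machine view of a rate-1/2 convolutional (trellis) encoder
# with the textbook (7,5) octal generator polynomials -- a standard example,
# not necessarily the TCnNet design.

def encode(bits, state=0):
    # state holds the last two input bits (bit 0 = most recent); the output
    # is a function of (state, input), which is the Mealy machine definition
    out = []
    for b in bits:
        s1, s0 = (state >> 1) & 1, state & 1   # x[n-2], x[n-1]
        out += [b ^ s0 ^ s1,   # generator 7 (binary 111): b + x[n-1] + x[n-2]
                b ^ s1]        # generator 5 (binary 101): b + x[n-2]
        state = ((state << 1) | b) & 0b11      # shift the new bit in
    return out

print(encode([1, 0, 1, 1]))
```

Each input bit produces two coded bits, so the trellis decoder at the receiving node can exploit the redundancy to correct transmission errors, which is what makes trellis coding attractive for energy-constrained nanonetwork links.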

Keywords: composite substrate, energy harvesting, finite state machine, graphene, nanotechnology, rectennas, wireless sensor networks

Procedia PDF Downloads 91
11668 DNA Nano Wires: A Charge Transfer Approach

Authors: S. Behnia, S. Fathizadeh, A. Akhshani

Abstract:

In recent decades, DNA has attracted increasing interest for potential technological applications not directly related to its role in coding for functional proteins, i.e., the expression of genetic information. One of the most interesting applications of DNA is the construction of nanostructures of high complexity and the design of functional nanostructures for nanoelectronic devices, nanosensors, and nanocircuits. In this field, DNA is of fundamental interest for the development of DNA-based molecular technologies, as it possesses ideal structural and molecular recognition properties for use in self-assembling nanodevices with a definite molecular architecture. Moreover, the robust, one-dimensional, flexible structure of DNA can be used to design electronic devices, serving as a wire, transistor switch, or rectifier depending on its electronic properties. Numerous studies have been carried out to understand the mechanism of charge transport along DNA sequences. In this regard, the conductivity properties of the DNA molecule can be investigated in a simple but chemically specific approach that is intimately related to the Su-Schrieffer-Heeger (SSH) model. In the SSH model, the dependence of the non-diagonal matrix elements on intersite displacements is taken into account, so that the coupling between the charge and the lattice deformation along the helix is captured. The SSH model is a tight-binding linear nanoscale chain originally established to describe conductivity phenomena in doped polyacetylene. It is based on the assumption of a classical harmonic interaction between sites, which is linearly coupled to a tight-binding Hamiltonian. In this work, the Hamiltonian and the corresponding equations of motion are nonlinear and highly sensitive to initial conditions, so we move toward nonlinear dynamics and phase space analysis. Nonlinear dynamics and chaos theory, free of any approximation, could open new horizons for understanding the conductivity mechanism in DNA. 
For a detailed study, we have examined the current flowing through DNA and investigated the characteristic I-V diagram. As a result, it is shown that there are (quasi-)ohmic regions in the I-V diagram. On the other hand, regions with negative differential resistance (NDR) are also detectable.
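
For reference, the generic SSH-type Hamiltonian the abstract describes (displacement-dependent hopping plus a classical harmonic lattice) takes the following textbook form; this is the standard parameterization of the model, not necessarily the exact one used by the authors:

```latex
H = \sum_n \varepsilon_n \, c_n^{\dagger} c_n
  - \sum_n \bigl[ t_0 - \alpha \,(u_{n+1} - u_n) \bigr]
    \bigl( c_{n+1}^{\dagger} c_n + c_n^{\dagger} c_{n+1} \bigr)
  + \sum_n \left( \frac{p_n^2}{2m} + \frac{k}{2} \,(u_{n+1} - u_n)^2 \right)
```

Here $t_0$ is the uniform hopping integral, $\alpha$ is the electron-lattice coupling that makes the non-diagonal (hopping) matrix elements depend on the intersite displacements $u_n$, and the last sum is the classical harmonic lattice with spring constant $k$. A nonzero $\alpha$ is precisely what couples charge transport to the helix deformation and renders the coupled equations of motion nonlinear.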

Keywords: DNA conductivity, Landauer resistance, negative differential resistance, chaos theory, mean Lyapunov exponent

Procedia PDF Downloads 409
11667 Production of Metal Powder Using Twin Arc Spraying Process for Additive Manufacturing

Authors: D. Chen, H. Daoud, C. Kreiner, U. Glatzel

Abstract:

Additive manufacturing (AM) provides promising opportunities to optimize tooling and to produce it with integrated near-contour tempering channels for more efficient cooling. To enhance the properties of tooling produced by additive manufacturing, prototypes should be produced within short periods. This requires small amounts of tailored powders, which either have a high production cost or are commercially unavailable. Hence, this study proposes an arc spray atomization approach to produce tailored metal powder at lower cost, and even in small quantities, compared with conventional powder production methods. The approach converts commercially available metal wire into powder by modifying the wire arc spraying process. The influences of the spray medium and the gas pressure on the powder properties were investigated. Particles with smooth surfaces and lower porosity were obtained when non-oxidizing gases were used for thermal spraying. The particle size decreased with increasing gas pressure, and particle sizes ranged from 10 to 70 µm, which is desirable for selective laser melting (SLM). A comparison of the microstructure and mechanical behavior of SLM-generated parts made from arc-sprayed powder and from commercial powder (both alloy X5CrNiCuNb 16-4) was also conducted.

Keywords: additive manufacturing, arc spraying, powder production, selective laser melting

Procedia PDF Downloads 124
11666 Evaluation of Water Management Options to Improve the Crop Yield and Water Productivity for Semi-Arid Watershed in Southern India Using AquaCrop Model

Authors: V. S. Manivasagam, R. Nagarajan

Abstract:

Modeling the interactions of soil, water, and crop growth is attaining major importance, considering future climate change and the water available for agriculture to meet the growing food demand. Progress in understanding crop growth response during water stress through crop modeling provides an opportunity for improving and sustaining future agricultural water use efficiency. An attempt has been made to evaluate the potential of the crop modeling approach for assessing the minimal supplementary irrigation requirement under water-limited conditions and its practical significance for sustainably improving crop yield and water productivity. Among the numerous crop models, the water-driven AquaCrop model was chosen for the present study, considering its modeling approach and its treatment of water stress impact on yield. The model was evaluated for the rainfed maize area of the semi-arid Shanmuganadi watershed (a tributary of the Cauvery river system) in southern India during the rabi cropping season (October-February). In addition to simulating actual rainfed maize growth, irrigated maize scenarios were simulated to assess the supplementary irrigation requirement during water shortage for the period 2012-2015. The simulation results for rainfed maize showed an average yield of 0.5-2 t ha-1 during deficit monsoon seasons (<350 mm), whereas 5.3 t ha-1 was observed during sufficient monsoon periods (>350 mm). The scenario results for irrigated maize during deficit monsoon periods revealed that 150-200 mm of supplementary irrigation ensured an irrigated maize yield of 5.8 t ha-1. Thus, the results clearly show that minimal supplementary irrigation applied during the critical growth period, together with the deficit rainfall, increased crop water productivity from 1.07 to 2.59 kg m-3 for the major soil types. 
Overall, AquaCrop was found to be very effective for sustainable irrigation assessment, considering the model's simplicity and minimal input requirements.
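
The crop water productivity figures quoted above (kg m-3) follow from a simple unit conversion: yield per unit of water consumed, where 1 mm of water depth over 1 ha equals 10 m3. The sample numbers below are illustrative, not values from the study:

```python
# Unit arithmetic behind crop water productivity (kg per m^3 of water):
# 1 mm of water over 1 ha = 10 m^3. Sample inputs are illustrative only.

def water_productivity(yield_t_per_ha, water_mm):
    yield_kg = yield_t_per_ha * 1000   # t/ha -> kg/ha
    water_m3 = water_mm * 10           # mm depth over 1 ha -> m^3/ha
    return yield_kg / water_m3         # kg per m^3

# e.g. 5.8 t/ha of maize grown on 500 mm of total water
print(round(water_productivity(5.8, 500), 2))
```

This makes clear why small amounts of well-timed supplementary irrigation can raise water productivity sharply: the denominator grows only slightly while the yield in the numerator recovers from stress.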

Keywords: AquaCrop, crop modeling, rainfed maize, water stress

Procedia PDF Downloads 253
11665 A Soft Computing Approach to Monitoring of Heavy Metals in Soil and Vegetables in the Republic of Macedonia

Authors: Vesna Karapetkovska Hristova, M. Ayaz Ahmad, Julijana Tomovska, Biljana Bogdanova Popov, Blagojce Najdovski

Abstract:

The average total concentrations of heavy metals (cadmium [Cd], copper [Cu], nickel [Ni], lead [Pb], and zinc [Zn]) were analyzed in soil and vegetable samples collected from different regions of Macedonia during the years 2010-2012. Basic soil properties such as pH, organic matter, and clay content were also included in the study. The average concentrations of Cd, Cu, Ni, Pb, and Zn in the A horizon (0-30 cm) of agricultural soils were, respectively, 0.25, 5.3, 6.9, 15.2, and 26.3 mg kg-1 of soil. We found that a neural network model can serve as a tool for prediction and spatial analysis of the processes controlling metal transfer between soil and vegetables. The predictive ability of such models is well over 80%, compared with 20% for typical regression models. A radial basis function network shows good prediction accuracy and captures the correlations between soil properties and metal content in vegetables much better than the back-propagation method. Neural networks and soft computing can support decision-making processes at different levels, including agroecology, to improve crop management based on monitoring data and risk assessment of metal transfer from soils to vegetables.
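
The model family the abstract favors, a radial basis function (RBF) network, can be sketched in a few lines: Gaussian basis functions at fixed centres plus a linear least-squares fit of the output weights. The synthetic "soil property" data below is invented for illustration and has nothing to do with the actual Macedonian dataset:

```python
# Minimal radial basis function (RBF) network regression sketch on synthetic
# data -- illustrating the model family, not the Macedonian soil/vegetable
# dataset itself.
import numpy as np

rng = np.random.default_rng(1)

# synthetic "soil properties" (e.g. pH, organic matter, clay) -> "metal uptake"
X = rng.uniform(0, 1, size=(200, 3))
y = np.sin(2 * np.pi * X[:, 0]) + 0.5 * X[:, 1] + rng.normal(0, 0.05, 200)

# hidden layer: Gaussian basis functions at randomly chosen centres
centres = X[rng.choice(len(X), 40, replace=False)]
width = 0.3

def design(X):
    # squared distance of every sample to every centre -> Gaussian activations
    d2 = ((X[:, None, :] - centres[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * width ** 2))

# output weights by linear least squares (the classic RBF training step)
Phi = design(X)
w, *_ = np.linalg.lstsq(Phi, y, rcond=None)

pred = design(X) @ w
rmse = np.sqrt(np.mean((pred - y) ** 2))
print(f"training RMSE: {rmse:.3f}")
```

Unlike back-propagation training of an MLP, the only iterative-free step here is a linear solve, which is one reason RBF networks are often quick to fit on small monitoring datasets.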

Keywords: soft computing approach, total concentrations, heavy metals, agricultural soils

Procedia PDF Downloads 354
11664 A Dataset of Program Educational Objectives Mapped to ABET Outcomes: Data Cleansing, Exploratory Data Analysis and Modeling

Authors: Addin Osman, Anwar Ali Yahya, Mohammed Basit Kamal

Abstract:

Datasets or collections are becoming important assets in themselves, and they can now be accepted as a primary intellectual output of research. The quality and usefulness of a dataset depend mainly on the context under which it has been collected, processed, analyzed, validated, and interpreted. This paper presents a collection of program educational objectives mapped to student outcomes, collected from self-study reports prepared by 32 engineering programs accredited by ABET. The manual mapping (classification) of such data is a notoriously tedious, time-consuming process that also requires domain experts, who are mostly unavailable. The operational settings under which the collection was produced are described. The collection has been cleansed and preprocessed, features have been selected, and preliminary exploratory data analysis has been performed to illustrate the properties and usefulness of the collection. Finally, the collection has been benchmarked using nine of the most widely used supervised multi-label classification techniques (Binary Relevance, Label Powerset, Classifier Chains, Pruned Sets, Random k-Label Sets, Ensemble of Classifier Chains, Ensemble of Pruned Sets, Multi-Label k-Nearest Neighbors, and Back-Propagation Multi-Label Learning). The techniques have been compared using well-known measures, including accuracy, Hamming loss, micro-F, and macro-F. The Ensemble of Classifier Chains and the Ensemble of Pruned Sets achieved encouraging performance compared with the other multi-label classification methods, while the Classifier Chains method showed the worst performance. To recap, the benchmark has achieved promising results by utilizing the preliminary exploratory data analysis performed on the collection, proposing new directions for research, and providing a baseline for future studies.
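
Two of the listed techniques, Binary Relevance and Classifier Chains, together with some of the reported evaluation measures, can be reproduced with scikit-learn on synthetic multi-label data. This is only a sketch of the benchmark's shape (the paper likely used dedicated multi-label tooling, and the ABET collection itself is not used here):

```python
# Sketch of two of the listed techniques -- Binary Relevance and Classifier
# Chains -- evaluated with some of the reported measures, on synthetic
# multi-label data (not the ABET collection itself).
from sklearn.datasets import make_multilabel_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import f1_score, hamming_loss
from sklearn.model_selection import train_test_split
from sklearn.multioutput import ClassifierChain, MultiOutputClassifier

X, Y = make_multilabel_classification(n_samples=400, n_classes=5,
                                      random_state=0)
X_tr, X_te, Y_tr, Y_te = train_test_split(X, Y, random_state=0)

models = {
    # Binary Relevance: one independent binary classifier per label
    "binary relevance": MultiOutputClassifier(LogisticRegression(max_iter=1000)),
    # Classifier Chains: each classifier also sees the previous labels
    "classifier chain": ClassifierChain(LogisticRegression(max_iter=1000)),
}
scores = {}
for name, model in models.items():
    P = model.fit(X_tr, Y_tr).predict(X_te)
    scores[name] = {"hamming": hamming_loss(Y_te, P),
                    "micro-F": f1_score(Y_te, P, average="micro"),
                    "macro-F": f1_score(Y_te, P, average="macro")}
print(scores)
```

The chain variant can exploit label correlations (e.g. two ABET outcomes that tend to co-occur), which Binary Relevance by construction ignores; the ensemble versions in the paper average over many random chain orders.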

Keywords: ABET, accreditation, benchmark collection, machine learning, program educational objectives, student outcomes, supervised multi-label classification, text mining

Procedia PDF Downloads 153
11663 Integrating Renewable Energy Forecasting Systems with HEMS and Developing It with a Bottom-Up Approach

Authors: Punit Gandhi, J. C. Brezet, Tim Gorter, Uchechi Obinna

Abstract:

This paper introduces how weather forecasting could help in more efficient energy management for smart homes using Home Energy Management Systems (HEMS). The paper also focuses on educating consumers and helping them make more informed decisions while using a HEMS. A combined technical and user perspective was adopted to develop a novel HEMS product-service combination in a comprehensive manner. Current HEMS switch energy-intensive appliances on and off based on fluctuating electricity tariffs; with weather forecasting, it becomes possible to shift the time of use of energy-intensive appliances to the period of maximum electricity production from the renewable energy system installed in the house. It is also possible to estimate the heating and cooling load of the house for the day ahead. Hence, relevant insight is gained into the expected energy production and consumption for the next day, facilitating better (more efficient, peak-shaved, cheaper) energy management practices for smart homes. On the user side, the literature shows that consumers tend to lose interest in using a HEMS after three to four months. Therefore, to further support better energy management practices, the new system had to be designed so that consumers would sustain their interaction with it on a structural basis. It is hypothesized that if consumers feel more comfortable using such a system, usage will be prolonged, yielding more energy savings and hence financial savings. To test this hypothesis, a survey on the HEMS was conducted, for which 59 valid responses were recorded. Analysis of the survey helped in designing a system that gives consumers better information about energy production and consumption. The survey also showed that consumers like a variety of options and do not like constant reminders of what they should do. 
Hence, the final system is designed to encourage consumers to make informed decisions about their energy usage, with a wide variety of behavioral options available. It is envisaged that the new system will be tested in several pioneering smart energy grid projects in both the Netherlands and India, with a continued 'design thinking' approach, combining the technical and user perspectives, as the basis for further improvements.
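
The core load-shifting mechanic described above amounts to picking the appliance run window with the most forecast renewable production. A toy sketch, with invented hourly figures standing in for a real day-ahead PV forecast:

```python
# Toy sketch of the load-shifting idea: given a day-ahead solar forecast,
# run an energy-intensive appliance over the window with the most forecast
# PV output. The hourly figures are illustrative, not real forecast data.

pv_forecast_kw = [0, 0, 0, 0, 0, 0.1, 0.4, 0.9, 1.6, 2.3, 2.9, 3.2,
                  3.3, 3.1, 2.6, 1.9, 1.1, 0.5, 0.1, 0, 0, 0, 0, 0]

def best_start_hour(forecast, run_hours):
    # choose the contiguous window with the most forecast PV energy
    windows = range(len(forecast) - run_hours + 1)
    return max(windows, key=lambda h: sum(forecast[h:h + run_hours]))

start = best_start_hour(pv_forecast_kw, run_hours=2)
print(f"suggested appliance window: {start}:00-{start + 2}:00")
```

Presenting the result as a suggestion with alternatives, rather than an automatic switch or a constant reminder, matches the survey finding that consumers want options rather than prescriptions.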

Keywords: weather forecasting, smart grid, renewable energy forecasting, user defined HEMS

Procedia PDF Downloads 218
11662 Influence of Spelling Errors on English Language Performance among Learners with Dysgraphia in Public Primary Schools in Embu County, Kenya

Authors: Madrine King'endo

Abstract:

This study dealt with the influence of spelling errors on English language performance among learners with dysgraphia in public primary schools in West Embu, Embu County, Kenya. The study aimed to investigate the influence of spelling errors on English language performance among class three pupils with dysgraphia in public primary schools. The objectives of the study were to identify the spelling errors that learners with dysgraphia make when writing English words and to classify those errors. Further, the study sought to establish how the spelling errors affect language performance among the participants and to suggest remediation strategies that teachers could use to address the errors. The study could provide stakeholders with relevant information on writing skills that could help in developing a responsive curriculum to accommodate the teaching and learning needs of learners with dysgraphia, and ensure that training in teacher training colleges is tailored to the writing needs of pupils with dysgraphia. The study was carried out in Embu County because the researcher did not find, in the related literature, any study on the influence of spelling errors on English language performance among learners with dysgraphia in public primary schools in the area. Moreover, besides being populated enough to provide the study sample, the area was fairly cosmopolitan, allowing a generalization of the findings. The study assumed that the sampled schools would have class three pupils with dysgraphia who exhibited written spelling errors. The study was guided by two spelling approaches, the connectionist simulation of the spelling process and the orthographic autonomy hypothesis, with a view to explaining how participants with learning disabilities spell written words. 
Data were collected through interviews, pupils' exercise books and progress records, and a spelling test constructed by the researcher based on the spelling scope set for class three pupils by the Ministry of Education in the primary education syllabus. The study relied on random sampling techniques in identifying general and specific participants. Since the study used schoolchildren as participants, voluntary consent was sought from the pupils themselves, their teachers, and the head teachers who were their caretakers in the school setting.

Keywords: dysgraphia, writing, language, performance

Procedia PDF Downloads 141
11661 A Comprehensive Study of Spread Models of Wildland Fires

Authors: Manavjit Singh Dhindsa, Ursula Das, Kshirasagar Naik, Marzia Zaman, Richard Purcell, Srinivas Sampalli, Abdul Mutakabbir, Chung-Horng Lung, Thambirajah Ravichandran

Abstract:

These days, wildland fires, also known as forest fires, are more prevalent than ever. Wildfires have major repercussions that affect ecosystems, communities, and the environment in several ways. Wildfires lead to habitat destruction and biodiversity loss, affecting ecosystems and causing soil erosion. They also contribute to poor air quality by releasing smoke and pollutants that pose health risks, especially for individuals with respiratory conditions. Wildfires can damage infrastructure, disrupt communities, and cause economic losses. The economic impact of firefighting efforts, combined with their direct effects on forestry and agriculture, causes significant financial difficulties for the areas impacted. This research explores different forest fire spread models and presents a comprehensive review of various techniques and methodologies used in the field. A forest fire spread model is a computational or mathematical representation that is used to simulate and predict the behavior of a forest fire. By applying scientific concepts and data from empirical studies, these models attempt to capture the intricate dynamics of how a fire spreads, taking into consideration a variety of factors like weather patterns, topography, fuel types, and environmental conditions. These models assist authorities in understanding and forecasting the potential trajectory and intensity of a wildfire. Emphasizing the need for a comprehensive understanding of wildfire dynamics, this research explores the approaches, assumptions, and findings derived from various models. By using a comparison approach, a critical analysis is provided by identifying patterns, strengths, and weaknesses among these models. The purpose of the survey is to further wildfire research and management techniques. Decision-makers, researchers, and practitioners can benefit from the useful insights that are provided by synthesizing established information. 
Fire spread models provide insights into potential fire behavior, helping authorities make informed decisions about evacuation activities, allocate resources for firefighting efforts, and plan preventive actions. Wildfire spread models are also useful in post-wildfire mitigation, as they help in assessing a fire's severity, identifying high-risk regions for post-fire hazards, and forecasting soil erosion trends. The analysis highlights the importance of modeling approaches customized to the circumstances at hand and advances our understanding of how forest fires spread. Some of the known models in this field are Rothermel's wildland fuel model, FARSITE, WRF-SFIRE, FIRETEC, FlamMap, FSPro, and the cellular automata model, among others. The key characteristics these models consider include weather (such as wind speed and direction), topography (such as landscape elevation), and fuel availability (such as vegetation type), among other factors. The models discussed are physics-based, data-driven, or hybrid, some utilizing ML techniques such as attention-based neural networks to enhance model performance. To lessen the destructive effects of forest fires, this initiative aims to promote the development of more precise prediction tools and effective management techniques. The survey also addresses the practical needs of numerous stakeholders: access to enhanced early warning systems enables decision-makers to take prompt action, and emergency responders benefit from improved resource allocation strategies, strengthening the efficacy of firefighting efforts.
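
Of the model families surveyed, the cellular automaton is the simplest to sketch: each burning cell ignites flammable neighbors with some probability per time step. In a real model that probability would be driven by the wind, slope, and fuel factors listed above; here it is a single constant and the landscape is a uniform grid, purely for illustration:

```python
# Minimal cellular-automaton fire spread sketch (one of the surveyed model
# families): each burning cell ignites unburnt 4-neighbours with a fixed
# probability, then burns out. A real model would modulate that probability
# by wind, slope and fuel type.
import numpy as np

UNBURNT, BURNING, BURNT = 0, 1, 2
rng = np.random.default_rng(0)

def step(grid, p_spread=0.5):
    new = grid.copy()
    for r, c in np.argwhere(grid == BURNING):
        for dr, dc in ((-1, 0), (1, 0), (0, -1), (0, 1)):
            rr, cc = r + dr, c + dc
            if (0 <= rr < grid.shape[0] and 0 <= cc < grid.shape[1]
                    and grid[rr, cc] == UNBURNT
                    and rng.random() < p_spread):
                new[rr, cc] = BURNING
        new[r, c] = BURNT       # a cell burns for one step, then is spent
    return new

grid = np.zeros((25, 25), dtype=int)
grid[12, 12] = BURNING          # ignition point
for _ in range(10):
    grid = step(grid)
print("cells burnt or burning:", int((grid != UNBURNT).sum()))
```

Despite its simplicity, this structure already reproduces the irregular, probabilistic fire fronts that make deterministic perimeter prediction hard, which is why CA models remain a common baseline in the literature.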

Keywords: artificial intelligence, deep learning, forest fire management, fire risk assessment, fire simulation, machine learning, remote sensing, wildfire modeling

Procedia PDF Downloads 68
11660 Revolutionizing Gaming Setup Design: Utilizing Generative and Iterative Methods for Prop and Environment Design, Transforming the Landscape of Game Development Through Automation and Innovation

Authors: Rashmi Malik, Videep Mishra

Abstract:

Generative design has become a transformative approach for efficiently producing multiple iterations of any design project. The conventional way of modeling game elements is very time-consuming and requires skilled artists. A 3D modeling tool such as 3ds Max or Blender is traditionally used to create the game library, and each asset takes its own time to model. This study focuses on using generative design tools to increase efficiency in game development at the prop and environment generation stage. This involves procedural level generation and customized, regulated, or randomized asset generation. The paper presents a system design approach using generative tools such as Grasshopper (visual scripting) and other scripting tools to automate the modeling of the game library. A single script enables the generation of multiple products, creating a system that lets designers and artists customize props and environments. The main goal is to measure the efficacy of the automated system in creating a wide variety of game elements, reducing the need for manual content creation and integrating it into the workflows of AAA and indie games.
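
Grasshopper itself is a visual-scripting environment, but the "one script, many regulated-random variants" idea translates directly into plain code. In the sketch below, the prop type, parameter names, and ranges are invented for illustration; a production pipeline would feed such parameter sets into a 3D tool's scripting API rather than print dictionaries:

```python
# Plain-Python analogue of the generative idea: one parametric script that
# emits many regulated-random prop variants. Parameter names and ranges are
# invented; a real pipeline would drive a 3D tool's API with these values.
import random

def crate_variant(rng):
    # each call yields one prop description within art-directed bounds
    return {
        "width":  round(rng.uniform(0.8, 1.6), 2),   # metres
        "height": round(rng.uniform(0.8, 1.2), 2),
        "planks": rng.randint(4, 8),
        "wear":   rng.choice(["new", "scuffed", "weathered"]),
    }

rng = random.Random(42)           # fixed seed -> reproducible asset batch
library = [crate_variant(rng) for _ in range(100)]
print(library[0])
print("distinct variants:", len({tuple(v.items()) for v in library}))
```

The fixed seed is the "regulated" part: the same script regenerates the identical library on demand, while changing the seed or the bounds yields a fresh but still art-directed batch.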

Keywords: iterative game design, generative design, gaming asset automation, generative game design

Procedia PDF Downloads 55
11659 Quality Assurance in Higher Education: Doha Institute for Graduate Studies as a Case Study

Authors: Ahmed Makhoukh

Abstract:

Quality assurance (QA) has recently become a common practice endorsed by most higher education (HE) institutions worldwide, due to the pressure of internal and external forces. One of the aims of this quality movement is to make the contribution of university education to socio-economic development highly significant. Graduates are now required to have a high-quality profile, i.e., to be competent and to master the 21st-century skills needed in the labor market. This wave of change, mostly driven by globalization, means that university education should be learner-centered in order to satisfy the different needs of students and meet the expectations of other stakeholders. Such a shift of focus toward student learning outcomes has led HE institutions to reconsider their strategic planning, their mission, the curriculum, and the pedagogical competence of their academic staff, among other elements. To ensure that overall institutional performance stays on track, a QA system should be established to check regularly the extent to which the set standards of evaluation are respected. QA has the advantage of demonstrating the accountability of the institution, gaining public trust through transparency, and earning international recognition. This is the case of the Doha Institute (DI) for Graduate Studies in Qatar, the object of the present study. The significance of this contribution is to show that the conception of quality has changed in the digital age, and that a department responsible for QA is needed in every HE institution to ensure educational quality, support learners, and achieve academic leadership. 
Thus, to examine QA at the DI for Graduate Studies, an elite university (in the academic sense) that focuses on a small, selected number of students, a qualitative method (document analysis) is adopted in the description and analysis of the data. In an attempt to investigate the extent to which QA is achieved at the Doha Institute for Graduate Studies, three broad indicators are evaluated: input, process, and learning outcomes. The investigation is carried out in line with the UK Quality Code for Higher Education maintained by the Quality Assurance Agency (QAA).

Keywords: accreditation, higher education, quality, quality assurance, standards

Procedia PDF Downloads 137
11658 Subtitling in the Classroom: Combining Language Mediation, ICT and Audiovisual Material

Authors: Rossella Resi

Abstract:

This paper describes a project carried out in an Italian school with English-learning pupils, combining three didactic tools attested to be relevant for the success of a young learner's language curriculum: the use of technology, intralingual and interlingual mediation (according to the CEFR), and the cultural dimension. The aim of this project was to test a hands-on technological translation activity like subtitling in a formal teaching context and to exploit its potential as a motivational tool for developing listening, writing, translation, and cross-cultural skills among language learners. The activities involved the use of the professional subtitling software Aegisub and culture-specific films. The workshop was optional, so motivation was based entirely on the pleasure of engaging with a realistic subtitling program and on the challenge of meeting the constraints that a real-life work situation might involve. Twelve pupils between the ages of 16 and 18 attended the afternoon workshop, which was organized in three parts: (i) an introduction, in which the learners were introduced to the concept and constraints of subtitling and given a few basic rules on spotting and segmentation; during this session learners also had time to familiarize themselves with the main software features. (ii) The second part involved three subtitling activities, in plenum or in groups. In the first activity the learners experienced the technical dimension of subtitling: they were given a short video segment together with its transcription to be segmented and time-spotted. The second activity also involved oral comprehension: learners had to understand and transcribe a video segment before subtitling it. The third activity embedded a translation task within a provided transcription, including segmentation and spotting of subtitles. (iii) The workshop ended with a small final project. At this point learners were able to carry out a short subtitling assignment (transcription, translation, segmenting, and spotting) on their own with a similar video interview. The results of these assignments exceeded expectations, since the learners were highly motivated by the authentic and original nature of the assignment. The subtitled videos were evaluated and watched in the regular classroom together with the students who did not take part in the workshop.

Keywords: ICT, L2, language learning, language mediation, subtitling

Procedia PDF Downloads 401
11657 Biosurfactant: A Greener Approach for Enhanced Concrete Rheology and Strength

Authors: Olivia Anak Rayeg, Clotilda Binti Petrus, Arnel Reanturco Ascotia, Ang Chung Huap, Caroline Marajan, Rudy Tawie Joseph Sipi

Abstract:

Concrete is essential for global infrastructure, yet enhancing its rheology and strength in an environmentally sustainable manner remains a significant challenge. Conventional chemical admixtures often pose environmental and health risks. This study explores the use of a phospholipid biosurfactant, derived from Rhizopus oryzae, as an environmentally friendly admixture in concrete. Various concentrations of the biosurfactant were integrated into fresh concrete, partially replacing the water content. The inclusion of the biosurfactant markedly enhanced the workability of the concrete, as demonstrated by Vertical Slump, Slump Flow, and T50 tests. After a 28-day curing period, the concrete's mechanical properties were assessed through compressive strength and bonding tests. Results revealed that substituting up to 10% of the water with the biosurfactant not only improved workability but also significantly increased both compressive and flexural strength. These findings highlight the potential of phospholipid biosurfactant as a biodegradable and non-toxic alternative to traditional admixtures, enhancing both structural integrity and sustainability in concrete. This approach reduces environmental impact and production costs, marking a significant advancement in sustainable construction technology.

Keywords: concrete rheology, green admixture, fungal biosurfactant, phospholipids, rhizopus oryzae

Procedia PDF Downloads 20
11656 Comparison between Approaches Used in Two Walk About Projects

Authors: Derek O Reilly, Piotr Milczarski, Shane Dowdall, Artur Hłobaż, Krzysztof Podlaski, Hiram Bollaert

Abstract:

Learning through the creation of contextual games is a very promising tool for interdisciplinary and international group projects. In 2013 and 2014 we organized and took part in two intensive student projects under different conditions. The projects enrolled 68 students and 12 mentors from 5 countries. In this paper we share our experience of how to strengthen the chances of success in short (12-15 day) student projects. In our case almost all teams prepared a working prototype, and the results were highly appreciated by external experts.

Keywords: contextual games, mobile games, GGULIVRR, walkabout, Erasmus intensive programme

Procedia PDF Downloads 487
11655 Interdisciplinary Integrated Physical Education Program Using a Philosophical Approach

Authors: Ellie Abdi, Susana Juniu

Abstract:

The purpose of this presentation is to describe an interdisciplinary teaching program that integrates physical education concepts using a philosophical approach. The presentation includes a review of: a) the philosophy of American education, b) the philosophy of sports and physical education, c) the interdisciplinary physical education program, d) professional development programs, e) the success of this physical education program, and f) the future of physical education. This unique interdisciplinary program has been implemented in an urban school physical education discipline in East Orange, New Jersey for over 10 years. During the program the students realize that their bodies go through different experiences. The body becomes a medium through which a child can learn, in an enjoyable way, to express and perceive particular feelings or mental states. Children may find that they have high abilities in the social or other domains but low abilities in athletics. The goal of this program is for individuals to discover new skills and to develop and demonstrate age-appropriate mastery of different tasks; the program therefore consists of 9 to 12 sports, including many games. Each successful experience increases self-awareness. Engaging in sports and physical activities is a social endeavor involving groups of children in situations such as teams, friendships, and recreational settings, which serve as a primary socializing agent for teaching interpersonal skills. As a result of this presentation the audience will reflect on and explore how to structure a physical education program that integrates interdisciplinary subjects with philosophical concepts.

Keywords: interdisciplinary disciplines, philosophical concepts, physical education, interdisciplinary teaching program

Procedia PDF Downloads 482
11654 Genetic Identification of Crop Cultivars Using Barcode System

Authors: Kesavan Markkandan, Ha Young Park, Seung-Il Yoo, Sin-Gi Park, Junhyung Park

Abstract:

For the genetic identification of crop cultivars, insertion/deletion (InDel) markers are currently preferred because they are easy to use, PCR-based, co-dominant, and relatively abundant. However, new InDels need to be developed for genetic studies of new varieties because allele frequencies at InDels differ among population groups. These new varieties evolve with low levels of genetic diversity at specific genomic loci with high recombination rates. In this study, we describe a soybean barcode system based on InDel markers, each of which is specific to a variation block (VB), where the genome is split by all assumed recombination sites. Firstly, VBs in crop cultivars were mined for transferability to VB-specific InDel markers. Secondly, putative InDels in the VB regions were identified for the development of the barcode system by analyzing whole-genome data of particular cultivars. Thirdly, common VB-specific InDels from all cultivars were selected by gel electrophoresis and converted into 2D barcode types by comparing amplicon polymorphisms in the five cultivars against the reference cultivar. Finally, the polymorphism of the selected markers was assessed with other cultivars, and a barcode system that allows a clear distinction among those cultivars is described. The same approach is applicable to other commercial crops. Hence, VB-based genetic identification not only minimizes the number of molecular markers required but is also useful for assessing cultivars and for marker-assisted breeding in other crop species.
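The barcode idea can be illustrated with a small sketch. Everything below is hypothetical (the cultivar names, the amplicon calls, and the encoding are invented to show the general principle, not the authors' actual pipeline): each cultivar's VB-specific InDel amplicons are scored as present (1) or absent (0) on a gel, and a barcode row marks the positions where a cultivar's pattern differs from the reference cultivar.

```python
# Hypothetical presence (1) / absence (0) calls for six VB-specific InDel
# amplicons; cultivar names and patterns are invented for illustration only.
cultivars = {
    "ref":  [1, 0, 1, 1, 0, 1],
    "cv_A": [1, 1, 0, 1, 0, 1],
    "cv_B": [0, 0, 1, 1, 1, 0],
}

def barcode(calls, ref):
    """Barcode row: 1 wherever the amplicon pattern differs from the reference."""
    return [int(c != r) for c, r in zip(calls, ref)]

codes = {name: barcode(calls, cultivars["ref"]) for name, calls in cultivars.items()}
# Stacking such rows gives a 2D barcode; the rows must be pairwise distinct
# for the chosen markers to distinguish the cultivars.
```

The reference row is all zeros by construction, so identification reduces to matching an unknown sample's row against the stored, pairwise-distinct rows.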

Keywords: variation block, polymorphism, InDel marker, genetic identification

Procedia PDF Downloads 367
11653 Participatory Air Quality Monitoring in African Cities: Empowering Communities, Enhancing Accountability, and Ensuring Sustainable Environments

Authors: Wabinyai Fidel Raja, Gideon Lubisa

Abstract:

Air pollution is a growing concern in Africa due to rapid industrialization and urbanization, with serious implications for public health and the environment. Establishing a comprehensive air quality monitoring network is crucial to combating this issue. However, conventional monitoring methods are insufficient in African cities due to the high cost of setup and maintenance. To address this, low-cost sensors (LCS) can be deployed across urban areas through participatory air quality network siting (PAQNS). PAQNS involves stakeholders from the community, local government, and the private sector working together to determine the most appropriate locations for air quality monitoring stations. This approach improves the accuracy and representativeness of air quality monitoring data, engages and empowers community members, and reflects the actual exposure of the population. Implementing PAQNS in African cities can build trust, promote accountability, and increase transparency in the air quality management process. However, challenges to implementing this approach must be addressed. Nonetheless, improving air quality is essential for protecting public health and promoting a sustainable environment, and participatory, data-informed air quality monitoring is a significant step toward achieving these goals in African cities and beyond.

Keywords: low-cost sensors, participatory air quality network siting, air pollution, air quality management

Procedia PDF Downloads 75
11652 Exploring the Influence of Culture on Dietary Practices and Ethnic Inequality in Health among Migrant Nigerians in the UK

Authors: Babatunde Johnson

Abstract:

The rate of illness and death from preventable diseases among ethnic minority groups is high compared with the wider white population in the UK. This can be due in part to diet and to various cultural factors. Changes in the dietary practices and health of ethnic minority groups can be caused by the adoption of the food practices of the host culture after migration (acculturation) and by generational differences among migrants. However, understanding of how and why these changes occur is limited by the challenges of data collection in research. This research uses an interpretive phenomenological approach, with Bourdieu's theory as the conceptual framework, to seek an in-depth understanding of how adult immigrant Nigerians in the UK interpret their experience of the influence of ethnic and prevailing cultures on their dietary practice. Recruiting participants from a close-knit community, such as the Nigerian population in the UK, can be complex and problematic and depends on access to the community. Although complex, the researcher leveraged the principles of Patient and Public Involvement (PPI) in gaining access to participants within the Nigerian community. This study emphasizes the need for a culturally tailored and community-centered approach to interventions geared toward reducing ethnic health inequality in the UK, rather than the existing practice, which focuses on better healthy eating through the improvement of skills and knowledge about food groups.

Keywords: culture, dietary practice, ethnic minority, health inequality

Procedia PDF Downloads 71
11651 Analysis of Bed Load Sediment Transport Mataram-Babarsari Irrigation Canal

Authors: Agatha Padma Laksitaningtyas, Sumiyati Gunawan

Abstract:

The Mataram Irrigation Canal, 31.2 km long, is the main irrigation canal in the Special Region of Yogyakarta Province, connecting the Progo River on the west side and the Opak River on the east side. It plays an important role as the main water distribution carrier for purposes such as agriculture, fishery, and plantation, and should therefore be kept free of sediment. Bed load is the sediment transported along the canal bed that drives sedimentation in the canal. Sedimentation deposits material on the canal bed and changes the water level, which affects the availability of water for irrigation. Two methods are used to predict the bed load transported in the canal: the Meyer-Peter and Müller method, an energy-based approach, and Einstein's method, a probabilistic approach. Flow velocity was measured using both the float method and current meters. The channel geometry was measured directly in the field, and the bed sediment was sampled in the field at three different points. The results show that the Meyer-Peter and Müller formula gives a bed load transport rate of 60.758 kg/s, whereas Einstein's method gives 13.065 kg/s. These results may serve as a reference for dredging sediment from the canal so as not to disrupt the flow of water in the irrigation canal.
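The Meyer-Peter and Müller calculation can be sketched in a few lines. This is a simplified wide-channel form under assumed inputs (the hydraulic radius is approximated by flow depth, and the conventional critical Shields parameter 0.047 is used); it is not the authors' computation, and a real application needs the measured channel geometry and grain-size distribution reported in the paper.

```python
import math

def mpm_bedload(depth_m, slope, d50_m, width_m,
                rho_s=2650.0, rho_w=1000.0, g=9.81, tau_c_star=0.047):
    """Meyer-Peter & Muller bed-load transport, simplified wide-channel form.

    Returns the mass transport rate in kg/s across the given width.
    """
    s = rho_s / rho_w                                   # sediment specific gravity
    tau_star = depth_m * slope / ((s - 1.0) * d50_m)    # Shields parameter (R ~ depth)
    excess = max(tau_star - tau_c_star, 0.0)            # excess shear above threshold
    # dimensionless transport -> volumetric rate per unit width (m^2/s)
    qb_vol = 8.0 * excess ** 1.5 * math.sqrt((s - 1.0) * g * d50_m ** 3)
    return qb_vol * rho_s * width_m                     # mass rate (kg/s)
```

For example, a 10 m wide channel with 1 m flow depth, a slope of 0.001, and 0.5 mm sand gives about 12 kg/s; below the critical Shields stress the formula correctly predicts zero transport.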

Keywords: bed load, sediment, irrigation, Mataram canal

Procedia PDF Downloads 212
11650 Parametric Modeling for Survival Data with Competing Risks Using the Generalized Gompertz Distribution

Authors: Noora Al-Shanfari, M. Mazharul Islam

Abstract:

The cumulative incidence function (CIF) is a fundamental approach for analyzing survival data in the presence of competing risks; it estimates the marginal probability of each competing event. Parametric modeling of the CIF has the advantage of fitting various shapes of CIF and estimating the impact of covariates with maximum efficiency. To calculate covariate effects on the total CIF using a parametric model, it is essential to parametrize the baseline of the CIF. As the CIF is an improper function by nature, an improper distribution must be used when applying parametric models. The Gompertz distribution, which is improper, is limited in its applicability as it only accounts for monotone hazard shapes. The generalized Gompertz distribution, however, can adapt to a wider range of hazard shapes, including unimodal, bathtub, and monotonically increasing or decreasing hazards. In this paper, the generalized Gompertz distribution is used to parametrize the baseline of the CIF, and the parameters of the proposed model are estimated using the maximum likelihood approach. The proposed model is compared with the existing Gompertz model using the Akaike information criterion. Appropriate statistical test procedures and model-fitting criteria will be used to test the adequacy of the model. Both models are applied to the ‘colon’ dataset, which is available in the “biostat3” package in R.
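The improper-baseline idea can be sketched numerically. The parameter names and the p-scaled form below are illustrative assumptions, not necessarily the authors' exact formulation: the generalized Gompertz CDF F(t) = [1 - exp(-(lam/c)(e^{ct} - 1))]^theta is scaled by a sub-distribution mass p < 1, so the resulting CIF plateaus below one, as an improper baseline should.

```python
import math

def gen_gompertz_cdf(t, lam, c, theta):
    """Generalized Gompertz CDF: F(t) = [1 - exp(-(lam/c)(e^{ct} - 1))]^theta."""
    return (1.0 - math.exp(-(lam / c) * (math.exp(c * t) - 1.0))) ** theta

def cif_baseline(t, p, lam, c, theta):
    """Improper baseline for one competing event: CIF(t) = p * F(t).

    The long-run plateau is p < 1, the marginal probability of this event.
    """
    return p * gen_gompertz_cdf(t, lam, c, theta)
```

With theta = 1 this reduces to the ordinary Gompertz baseline; theta != 1 is what admits the non-monotone (unimodal, bathtub) hazard shapes mentioned in the abstract.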

Keywords: competing risks, cumulative incidence function, improper distribution, parametric modeling, survival analysis

Procedia PDF Downloads 75
11649 Clustered Regularly Interspaced Short Palindromic Repeat/cas9-Based Lateral Flow and Fluorescence Diagnostics for Rapid Pathogen Detection

Authors: Mark Osborn

Abstract:

Clustered, regularly interspaced short palindromic repeat (CRISPR/Cas) proteins can be designed to bind specified DNA and RNA sequences and hold great promise for the accurate detection of nucleic acids for diagnostics. Commercially available reagents were integrated into a CRISPR/Cas9-based lateral flow assay that can detect severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) sequences with single-base specificity. This approach requires minimal equipment and represents a simplified platform for field-based deployment. A rapid, multiplex fluorescence CRISPR/Cas9 nuclease cleavage assay capable of detecting and differentiating SARS-CoV-2, influenza A and B, and respiratory syncytial virus in a single reaction was also developed. These findings provide proof of principle for CRISPR/Cas9 point-of-care diagnosis that can detect specific SARS-CoV-2 strain(s). Further, Cas9 cleavage allows for a scalable fluorescent platform for identifying respiratory viral pathogens with overlapping symptomology. Collectively, this approach is a facile platform for diagnostics with broad application to user-defined sequence interrogation and detection.
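A first step in designing a Cas9-based detection assay like the one described is choosing a spacer adjacent to an NGG PAM in the target sequence. The scan below is a generic illustration of that step only (the sequence in the example is a toy string; real guide design also scans the reverse complement and screens for off-targets and secondary structure):

```python
def find_cas9_targets(seq, spacer_len=20):
    """Return (position, spacer) pairs where a spacer is immediately followed
    by an NGG PAM on the given strand."""
    seq = seq.upper()
    hits = []
    for i in range(len(seq) - spacer_len - 2):
        pam = seq[i + spacer_len : i + spacer_len + 3]
        if pam[1:] == "GG":                    # NGG PAM required by SpCas9
            hits.append((i, seq[i : i + spacer_len]))
    return hits
```

For single-base discrimination, as in the assay above, candidate spacers are then chosen so that the strain-specific variant falls in the PAM-proximal "seed" region, where mismatches most strongly abolish Cas9 binding and cleavage.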

Keywords: CRISPR/Cas9, lateral flow assay, SARS-CoV-2, single-nucleotide resolution

Procedia PDF Downloads 171
11648 Extended Knowledge Exchange with Industrial Partners: A Case Study

Authors: C. Fortin, D. Tokmeninova, O. Ushakova

Abstract:

Among 500 Russian universities, the Skolkovo Institute of Science and Technology (Skoltech) is one of the youngest (established in 2011), quite small, and vastly international, with 20 percent international students and 70 percent of faculty having significant academic experience at top-100 universities (QS, THE). The institute emerged from close collaboration with MIT and leading Russian universities and is an entirely English-speaking environment. The curriculum plans of Skoltech's ten Master's programs are based on the CDIO learning-outcomes model. However, despite the Institute's unique focus on industrial innovation and startups, one of the main challenges has been that nearly half of MSc graduates enter PhD programs at Skoltech or other universities rather than industry or entrepreneurship. In order to increase the share of students joining the industrial sector after graduation, Skoltech began implementing a number of unique practices, incorporating employers' expectations into the curriculum redesign. In this sense, extended knowledge exchange with industrial partners via collaboration in learning activities, industrial projects, and assessments became essential for guiding students toward industrial and entrepreneurship pathways. The current academic curriculum includes the following components based on extended knowledge exchange with industrial partners: an innovation workshop, industrial immersion, special industrial tracks, and MSc defenses. The innovation workshop is a four-week full-time dive into the vibrant Skoltech ecosystem designed to foster innovators; it focuses on teamwork and group projects and sparks entrepreneurial instincts from the very first days of study. Since 2019 the number of mentors from industry and startups has increased significantly to guide students across these sectors' demands. Industrial immersion is an exclusive part of the Skoltech curriculum in which students, after the first year of study, spend 8 weeks in an industrial company carrying out an individual or team project, guided jointly by Skoltech and company supervisors. The aim of the industrial immersion is to familiarize students with the relevant needs of Russian industry and to prepare graduates for job placement. During the immersion the company plays the role of a challenge provider for the students. Skoltech has also started a special industrial track comprising deep collaboration with IPG Photonics, a leading R&D company and manufacturer of high-performance fiber lasers and amplifiers for diverse applications. The track aims to train a new cohort of engineers and includes a variety of activities for students within the “Photonics” MSc program. It is expected to be a success story and to serve as an example for similar initiatives with other Russian high-tech companies. Another pathway of extended knowledge exchange with industrial partners is the active involvement of potential employers in MSc Defense Committees to review and assess MSc thesis projects and to participate in defense procedures. The paper will evaluate the effect and results of the measures described above.

Keywords: curriculum redesign, knowledge exchange model, learning outcomes framework, stakeholder engagement

Procedia PDF Downloads 68
11647 Plethora of Drivers Transforming Colonial Cities: The Case of Allahabad

Authors: Akanksha Gupta, Vishal Dubey

Abstract:

In the neoliberal era, there has been much discourse about the urban issues that arise from the narrow, single-rationality approach of market-driven planning in Indian cities. Moreover, India's urban planning is already jeopardized by a critical shortage of infrastructure and a cluster of incoherent governing bodies and implementation mechanisms, leaving cities mired in a plethora of urban challenges. In this context, Allahabad (now known as Prayagraj), a city in North India, is no exception. Once among the most splendidly planned Colonial cities of the British regime in India, it collapsed phenomenally because of the incompetence of the planning machinery, a straightforwardly market-driven approach, and a lack of attention to urban equity and sustainability. In particular, Civil Lines, a Colonial neighbourhood once at the zenith of the glorified Colonial legacy, has been transformed into a filthy and congested urban form. Against this background, the study contemplates and assesses the chronological episodes of major changes in land management reforms and policies, shaped by an ad hoc approach to political economy and land use planning, which have radically degraded the present living environment. The study empirically showcases a selected sample area, detailing some of the major consequences in terms of gradual change in urban morphology, land use, and function. The method is primarily qualitative, employing oral history and other historical methods to exhibit the planning conundrum. This subsequently reflects repercussions such as unclear land titles, encroachment, unauthorized development, and the mushrooming of informal and squatter settlements. In a nutshell, the study seeks to draw out the limitations of the land reform and land management policies, which contributed to the general degradation of the beautiful setting of this Colonial neighbourhood. The Colonial legacy of Civil Lines now exists only in the traces of history and in the memories of people who once took pride in its serenity and have witnessed its transformation bit by bit as neoliberal market forces swallowed it completely.

Keywords: civil lines, land reforms, policies, urban challenges

Procedia PDF Downloads 108
11646 Microfinance for the Marginalised: The Impact of the Rojiroti Approach in India

Authors: Gil Yaron, Rebecca Gordon, John Best, Sunil Choudhary

Abstract:

There have been a number of studies examining the impact of microfinance; however, the magnitude of impact varies across regions, and the evidence is mixed due to differences in the nature of the interventions, the context, and the way in which microfinance is implemented. The Rojiroti approach to microfinance involves the creation of women's self-help groups (SHGs), loans rotated from savings, and subsequent credit from a Bihar-based NGO. Rojiroti serves customers who are significantly poorer and more marginalised than those typically served by microfinance in India: in the data analysed, more than 90 percent of members are from scheduled castes and tribes (62 percent) or other disadvantaged castes. This paper analyses the impact of Rojiroti microfinance using panel data on 740 new SHG members and 340 women in matched control sites at baseline and after 18 months. We consider changes in assets, children's education, women's mobility, and domestic violence, among other indicators. The results show significant gains for Rojiroti borrowers relative to control sites for important, though not all, variables. Comparison with longer-standing SHGs (at least 36 months old) helps to explain how the borrowing patterns of poor and marginalised SHG members evolve. The context of this intervention is also important: here, innovative microfinance is provided to much poorer and more marginalised women than is typically the case, and the results contrast with numerous studies that show little or no effect of microfinance on the lives of clients.

Keywords: microfinance, gender, impact, pro-poor

Procedia PDF Downloads 145
11645 LTE Modelling of a DC Arc Ignition on Cold Electrodes

Authors: O. Ojeda Mena, Y. Cressault, P. Teulet, J. P. Gonnet, D. F. N. Santos, MD. Cunha, M. S. Benilov

Abstract:

The assumption of plasma in local thermal equilibrium (LTE) is commonly used in electric arc simulations for industrial applications. This assumption allows the arc to be modelled with a set of magnetohydrodynamic equations that can be solved with a computational fluid dynamics code. However, the LTE description is valid only in the arc column, whereas in the regions close to the electrodes the plasma deviates from the LTE state. The importance of these near-electrode regions is non-trivial, since they define the energy and current transfer between the arc and the electrodes. Therefore, any accurate model of the arc must include a good description of the arc-electrode phenomena. Because of the modelling complexity and computational cost of resolving the near-electrode layers, a simplified description of the arc-electrode interaction was developed in a previous work to study a steady high-pressure arc discharge, in which the near-electrode regions are introduced as boundary conditions at the interface between arc and electrode. The present work proposes a similar approach to simulate arc ignition in a free-burning arc configuration following an LTE description of the plasma. To obtain the transient evolution of the arc characteristics, appropriate boundary conditions for both the near-cathode and near-anode regions are used, based on recent publications. The arc-cathode interaction is modelled using a non-linear surface heating approach that accounts for secondary electron emission, while the interaction between the arc and the anode is taken into account by means of the heating voltage approach. The numerical modelling reveals three main stages during arc ignition. Initially, a glow discharge is observed, in which the cold non-thermionic cathode is uniformly heated at its surface and the near-cathode voltage drop is of the order of a few hundred volts. Next, a high-temperature spot forms at the cathode tip, followed by a sudden decrease in the near-cathode voltage drop, marking the glow-to-arc transition. During this stage, the LTE plasma also shows a significant temperature increase in the region adjacent to the hot spot. Finally, the near-cathode voltage drop stabilizes at a few volts, and both the electrode and plasma temperatures reach the steady solution. The results after a few seconds are similar to those reported for thermionic cathodes.

Keywords: arc-electrode interaction, thermal plasmas, electric arc simulation, cold electrodes

Procedia PDF Downloads 107
11644 Filtering Intrusion Detection Alarms Using Ant Clustering Approach

Authors: Ghodhbani Salah, Jemili Farah

Abstract:

With the growth of cyber attacks, information security has become an important issue all over the world. Many firms rely on security technologies such as intrusion detection systems (IDSs) to manage information technology security risks. IDSs are considered the last line of defense in securing a network and play a very important role in detecting a large number of attacks. However, the main problem with today’s most popular commercial IDSs is that they generate a high volume of alerts and a huge number of false positives. This drawback has become the main motivation for many research papers in the IDS area. Hence, in this paper we present a data mining technique to assist network administrators in analyzing and reducing the false positive alarms produced by an IDS, thereby increasing detection accuracy. Our technique is an unsupervised clustering method based on a hybrid ant algorithm. The algorithm discovers clusters of intruder behavior without prior knowledge of the number of classes; we then apply the K-means algorithm to improve the convergence of the ant clustering. Experimental results on a real dataset show that the proposed approach is efficient, with a high detection rate and a low false alarm rate.
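The K-means refinement stage described above can be sketched as follows. The ant-clustering stage is specific to the paper and is not reproduced here; this minimal K-means runs on toy two-feature alert vectors (the feature names are invented), with centroids seeded randomly where the paper would seed them from the ant clusters.

```python
import random

def kmeans(points, k, iters=20, seed=0):
    """Plain K-means over alert feature vectors. In the paper's pipeline the
    initial centroids would come from the ant-clustering stage; here they are
    sampled at random for illustration."""
    rng = random.Random(seed)
    centroids = list(rng.sample(points, k))
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            # assign each alert to its nearest centroid (squared Euclidean)
            nearest = min(range(k),
                          key=lambda c: sum((a - b) ** 2
                                            for a, b in zip(p, centroids[c])))
            clusters[nearest].append(p)
        for j, members in enumerate(clusters):
            if members:                     # keep old centroid if cluster empties
                centroids[j] = tuple(sum(dim) / len(members)
                                     for dim in zip(*members))
    return centroids, clusters

# Toy alerts: (normalised priority, connection count) -- two obvious groups,
# e.g. routine noise vs. alerts worth an analyst's attention.
alerts = [(0.1, 0.2), (0.15, 0.25), (0.9, 0.8), (0.85, 0.9)]
centroids, clusters = kmeans(alerts, k=2)
```

Once the alerts are grouped, a large, homogeneous cluster of low-priority alerts is a natural candidate for filtering as probable false positives, which is the alarm-reduction goal of the abstract.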

Keywords: intrusion detection system, alarm filtering, ANT class, ant clustering, intruders’ behaviors, false alarms

Procedia PDF Downloads 391