Search results for: sustainability assessment framework

460 Conceptual Model for Massive Open Online Blended Courses Based on Disciplines’ Concepts Capitalization and Obstacles’ Detection

Authors: N. Hammid, F. Bouarab-Dahmani, T. Berkane

Abstract:

Since its appearance, the MOOC (massive open online course) has been gaining more and more attention from educational communities around the world. Beyond the current MOOC designs and purposes, MOOC creators have stressed the importance of connection and knowledge exchange between individuals in learning. In this paper, we present a conceptual model for massive open online blended courses in which teachers from around the world collaborate and exchange their experience to produce a common, efficient content designed as a MOOC and opened to their students, so that those students enjoy a better learning experience. The model is based on the capitalization of disciplines' concepts and on the detection of the obstacles met by students when faced with problem situations (exercises, projects, case studies, etc.). This detection is made possible by analyzing the frequency of semantic errors committed by the students. The participation of teachers in the design of the course, and the attendance of their own students, can guarantee efficient and extensive participation (a large number of participants), learner motivation and sound evaluation, in the sense that the teachers who design the course assess their own students. Thus, the teachers' combined review and knowledge offer their students better assessment and more efficient connections.

Keywords: MOOC, Massive Open Online Courses, Online learning, E-learning, Blended learning.

459 Ground Motion Modelling in Bangladesh Using Stochastic Method

Authors: Mizan Ahmed, Srikanth Venkatesan

Abstract:

The geological and tectonic framework indicates that Bangladesh is one of the most seismically active regions in the world. The Bengal Basin lies at the junction of three major interacting plates: the Indian, Eurasian, and Burma Plates. Besides, there are many active faults within the region, e.g., the large Dauki Fault in the north. The country has experienced a number of destructive earthquakes due to the movement of these active faults. The current seismic provisions of Bangladesh are mostly based on earthquake data recorded prior to 1990. Given the record of earthquakes post-1990, there is a need to revisit the design provisions of the code. This paper compares the base shear demand of three major cities in Bangladesh, Dhaka (the capital city), Sylhet, and Chittagong, for earthquake scenarios of moment magnitude (Mw) 7.0, 7.5, 8.0, and 8.5 using a stochastic model. In particular, the stochastic model allows the flexibility to input region-specific parameters such as the shear wave velocity profile (developed from the global crustal model CRUST2.0) and to include the effects of attenuation as individual components. Effects of soil amplification were analysed using the Extended Component Attenuation Model (ECAM). Results show that the estimated base shear demand is higher than the code provisions, leading to the suggestion of additional seismic design consideration in the study regions.

Keywords: Attenuation, earthquake, ground motion, stochastic, seismic hazard.
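For readers unfamiliar with the stochastic simulation approach mentioned in this abstract, the core of the point-source method is the far-field Fourier acceleration spectrum shown below. This is the standard Boore-type formulation, given only for orientation; the exact parameterization and the region-specific values used by the authors are not reproduced.

```latex
A(f) \;=\; C\,M_0\,\frac{(2\pi f)^2}{1+(f/f_c)^2}\;
\exp\!\left(-\frac{\pi f R}{Q(f)\,\beta}\right)\exp\!\left(-\pi\kappa_0 f\right)\frac{1}{R},
\qquad
f_c \;=\; 4.9\times 10^{6}\,\beta\left(\frac{\Delta\sigma}{M_0}\right)^{1/3}
```

Here, M0 is the seismic moment (dyne-cm), fc the corner frequency (Hz), Δσ the stress drop (bar), β the shear-wave velocity (km/s), Q(f) the regional quality factor, κ0 the near-site attenuation parameter, R the hypocentral distance, and C a constant collecting the radiation-pattern, free-surface and energy-partition factors. Region-specific inputs such as the CRUST2.0 shear-wave velocity profile enter through β and the crustal amplification applied to this spectrum.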

458 Fuzzy Join Dependency in Fuzzy Relational Databases

Authors: P. C. Saxena, D. K. Tayal

Abstract:

The join dependency provides the basis for obtaining a lossless join decomposition in a classical relational schema. The existence of a join dependency shows that the tables always represent the correct data after being joined. Since classical relational databases cannot handle imprecise data, they were extended to fuzzy relational databases so that uncertain, ambiguous, imprecise and partially known information can also be stored in a formal way. However, like classical databases, fuzzy relational databases also undergo decomposition during normalization, so the issue of joining the decomposed fuzzy relations remains. Our effort in the present paper is to address this issue. We define fuzzy join dependency in the framework of type-1 and type-2 fuzzy relational databases using the concept of fuzzy equality, which is defined through fuzzy functions. We use the fuzzy equi-join operator for computing the fuzzy equality of two attribute values. We also discuss the dependency preservation property on execution of this fuzzy equi-join and derive the necessary condition for the fuzzy functional dependencies to be preserved on joining the decomposed fuzzy relations. We further derive the conditions for a fuzzy join dependency to exist in the context of both type-1 and type-2 fuzzy relational databases. We find that, unlike in classical relational databases, even the existence of a trivial join dependency does not ensure lossless join decomposition in type-2 fuzzy relational databases. Finally, we derive the conditions for the fuzzy equality to be nonzero and for the qualification of an attribute as a fuzzy key.

Keywords: Fuzzy equi-join, fuzzy functions, fuzzy join dependency, type-1 fuzzy relational database, type-2 fuzzy relational database.
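For orientation, the crisp notion that the paper generalizes is the classical join dependency: a relation instance r over schema R satisfies ⋈[R1, ..., Rn] if it decomposes losslessly into its projections, as written below. In the fuzzy setting, this strict equality is replaced by a degree of fuzzy equality evaluated attribute-wise through the fuzzy equi-join; the authors' membership functions are not reproduced here.

```latex
r \;=\; \pi_{R_1}(r) \,\bowtie\, \pi_{R_2}(r) \,\bowtie\, \cdots \,\bowtie\, \pi_{R_n}(r)
```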

457 Hacking's 'Between Goffman and Foucault': A Theoretical Frame for Criminology

Authors: Tomás Speziale

Abstract:

This paper aims to analyse how Ian Hacking states the theoretical basis of his research on the classification of people. Although all his early philosophical education was based on Foucault, it is also true that Erving Goffman's perspective provided him with epistemological and methodological tools for understanding face-to-face relationships. Hence, all his works must be thought of as social science texts that combine research on how individuals are constituted 'top-down' (as in Foucault) with inquiry into how people renegotiate 'bottom-up' the classifications about them. Thus, Hacking's proposal constitutes a middle ground between the French philosopher and the American sociologist. Placing himself between both authors allows Hacking to build a frame that is expected to fit the Social Sciences' main particularity: the fact that they study interactive kinds. These are kinds of people, which implies that those who are classified can change in ways that prompt the need to change the previous classifications themselves. It is all about the interaction between the labelling of people and the people who are classified. Consequently, understanding the way in which Hacking uses Foucault's and Goffman's theories is essential to fully comprehend the social dynamic between individuals and concepts, what Bert Hansen called dialectical realism. His theoretical proposal, therefore, is not only valuable because it combines diverse perspectives, but also because it constitutes an utterly original and relevant framework for Sociological theory and particularly for Criminology.

Keywords: Classification of people, Foucault's archaeology, Goffman's interpersonal sociology, interactive kinds.

456 Rotorcraft Performance and Environmental Impact Evaluation by Multidisciplinary Modelling

Authors: Pierre-Marie Basset, Gabriel Reboul, Binh DangVu, Sébastien Mercier

Abstract:

Rotorcraft provide invaluable services thanks to their vertical take-off and landing (VTOL), hover and low-speed capabilities. Yet their use is still often limited by their cost and environmental impact, especially noise and energy consumption. One of the main obstacles to the expansion of rotorcraft use for urban missions is this environmental impact, and the population's first concern is noise. In order to develop the transversal competency to assess the rotorcraft environmental footprint, a collaboration has been launched between six research departments within ONERA. The progress in terms of models and methods is capitalized into the numerical workshop C.R.E.A.T.I.O.N. (Concepts of Rotorcraft Enhanced Assessment Through Integrated Optimization Network). A typical mission for which the environmental impact issue is of great relevance has been defined. The first milestone is to perform the pre-sizing of a reference helicopter for this mission. In a second milestone, an alternative rotorcraft concept has been defined: a tandem rotorcraft with optional propulsion. The key design trends are given for the pre-sizing of this rotorcraft, aiming at a significant reduction of the global environmental impact while still providing flight performance and safety equivalent to the reference helicopter. The models and methods have been improved to capture, earlier and more globally, the relative variations in environmental impact when changing the rotorcraft architecture, the pre-design variables and the operating parameters.

Keywords: Environmental impact, flight performance, helicopter, rotorcraft pre-sizing.

455 Cultivating Focal Firm's Supply Chain Process Integration Capabilities: The Investigation of Critical Determinants and Consequences

Authors: Chun-Der Chen, Yi-Wen Fan, Cheng-Kiang Farn

Abstract:

In today's competitive global business environment, the concept of supply chain management (SCM) continues to become increasingly market-oriented, shifting the primary driver of the value chain from supply to demand. Recent recommendations encourage researchers to focus investigations on the supply chain process integration (SCPI) capabilities that integrate a focal firm with its network of suppliers and business customers to create value for it. However, theoretical and empirical research pertaining to the antecedents and consequences of a focal firm's SCPI capabilities has been limited and piecemeal. The purpose of this study is to investigate the critical determinants and consequences of a focal firm's SCPI capabilities. We test our proposed research framework using a sample of 139 sales managers from manufacturing industries in Taiwan. Our research findings show that (1) both perceived business customer's power and a focal firm's market-oriented culture positively influence the focal firm's SCPI capabilities, and (2) SCPI capabilities positively influence a focal firm's SCM performance, in terms of both operational and strategic benefits. Implications for practitioners and researchers and suggestions for future research are also addressed.

Keywords: Supply chain process integration capabilities, Perceived business customer's power, Market-oriented culture, Supply chain management performance.

454 Catalytic Gasification of Olive Mill Wastewater as a Biomass Source under Supercritical Conditions

Authors: Ekin Kıpçak, Mesut Akgün

Abstract:

Recently, a growing interest has emerged in the development of new and efficient energy sources, due to the inevitable depletion of nonrenewable energy reserves. One of these alternative sources, which has great potential and the sustainability to meet the energy demand, is biomass energy. This significant energy source can be utilized with various energy conversion technologies, one of which is biomass gasification in supercritical water.

Water, the most important solvent in nature, has very important characteristics as a reaction solvent under supercritical conditions. At temperatures above its critical point (374.8 °C and 22.1 MPa), water becomes more acidic and its diffusivity increases. Working with water at high temperatures increases the thermal reaction rate, which in consequence leads to better dissolution of the organic matter and a fast reaction with oxygen. Hence, supercritical water offers a control mechanism depending on solubility, excellent transport properties based on its high diffusivity, and new reaction possibilities for hydrolysis or oxidation.

In this study, the gasification of a real biomass, namely olive mill wastewater (OMW), under supercritical water conditions is investigated with the use of a Ru/Al2O3 catalyst. OMW is a by-product obtained during olive oil production, which has a complex nature characterized by a high content of organic compounds and polyphenols. These properties give OMW a significant pollution potential, but at the same time, the high content of organics makes OMW a desirable biomass candidate for energy production.

The catalytic gasification experiments were carried out at five reaction temperatures (400, 450, 500, 550 and 600 °C) and five reaction times (30, 60, 90, 120 and 150 s), under a constant pressure of 25 MPa. Through these experiments, the effects of reaction temperature and time on the gasification yield, the gaseous product composition and the OMW treatment efficiency were investigated.

Keywords: Catalyst, Gasification, Olive mill wastewater, Ru/Al2O3, Supercritical water.

453 Optimization of Two Quality Characteristics in Injection Molding Processes via Taguchi Methodology

Authors: Joseph C. Chen, Venkata Karthik Jakka

Abstract:

The main objective of this research is to optimize tensile strength and dimensional accuracy in injection molding processes using Taguchi Parameter Design. An L16 orthogonal array (OA) is used in the Taguchi experimental design, with five control factors at four levels each and with vibration as a non-controllable factor. A total of 32 experiments were designed to obtain the optimal parameter setting for the process. The optimal parameters identified for shrinkage are shot volume, 1.7 cubic inches (A4); mold temperature, 130 °F (B1); hold pressure, 3200 psi (C4); injection speed, 0.61 in³/sec (D2); and hold time, 14 seconds (E2). The optimal parameters identified for tensile strength are shot volume, 1.7 cubic inches (A4); mold temperature, 160 °F (B4); hold pressure, 3100 psi (C3); injection speed, 0.69 in³/sec (D4); and hold time, 14 seconds (E2). The Taguchi-based optimization framework was systematically and successfully implemented to obtain an adjusted optimal setting in this research. The mean shrinkage of the confirmation runs is 0.0031%, and the tensile strength value was found to be 3148.1 psi. Both outcomes are far better than the baseline, and defects have been further reduced in the injection molding processes.

Keywords: Injection molding processes, Taguchi Parameter Design, tensile strength, shrinkage test, high-density polyethylene, HDPE.
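As a minimal illustration of the signal-to-noise (S/N) analysis behind Taguchi Parameter Design (not the authors' code; the measurement values are hypothetical), tensile strength is scored with the larger-the-better criterion and shrinkage with the smaller-the-better criterion:

```python
import numpy as np

def sn_larger_the_better(y):
    """S/N ratio for responses to maximize (e.g., tensile strength)."""
    y = np.asarray(y, dtype=float)
    return -10.0 * np.log10(np.mean(1.0 / y**2))

def sn_smaller_the_better(y):
    """S/N ratio for responses to minimize (e.g., shrinkage)."""
    y = np.asarray(y, dtype=float)
    return -10.0 * np.log10(np.mean(y**2))

# Hypothetical replicated measurements for one L16 run (two vibration/noise conditions).
tensile_psi = [3120.0, 3155.0]
shrinkage_pct = [0.0035, 0.0029]

print(sn_larger_the_better(tensile_psi))     # higher S/N is better
print(sn_smaller_the_better(shrinkage_pct))  # higher S/N means less shrinkage
```

For each control factor A to E, the level with the largest mean S/N ratio over the 16 runs is then selected as the optimum, which is how settings such as A4, B1, C4, D2 and E2 are identified.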

452 Dicotyledon Weed Quantification Algorithm for Selective Herbicide Application in Maize Crops: Statistical Evaluation of the Potential Herbicide Savings

Authors: Morten Stigaard Laursen, Rasmus Nyholm Jørgensen, Henrik Skov Midtiby, Anders Krogh Mortensen, Sanmohan Baby

Abstract:

This work contributes a statistical model and simulation framework yielding the best possible estimate of the potential herbicide reduction when using the MoDiCoVi algorithm, while requiring an efficacy comparable to conventional spraying. In June 2013, a maize field located in Denmark was seeded. The field was divided into parcels which were assigned to one of two main groups: 1) Control, consisting of subgroups with no spray and full-dose spray; 2) the MoDiCoVi algorithm, subdivided into five different leaf-cover thresholds for spray activation. In addition, approximately 25% of the parcels were seeded with additional weeds perpendicular to the maize rows. In total, 299 parcels were randomly assigned to the 28 different treatment combinations. In the statistical analysis, bootstrapping was used for balancing the number of replicates. The achieved potential herbicide savings were found to be 70% to 95%, depending on the initial weed coverage. However, additional field trials covering more seasons and locations are needed to verify the generalisation of these results. There is potential for further herbicide savings, as the time interval between the first and second spraying session was not long enough for the weeds to turn yellow; instead, they only stagnated in growth.

Keywords: Weed crop discrimination, macrosprayer, herbicide reduction, site-specific, sprayer-boom.
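The bootstrap step mentioned above can be illustrated with a generic percentile bootstrap of the estimated herbicide saving. The parcel data below are invented placeholders, and this sketch does not reproduce the authors' balancing or efficacy analysis:

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical per-parcel herbicide use relative to full dose (1.0 = full dose).
modicovi_dose_fraction = rng.uniform(0.05, 0.30, size=150)

def herbicide_saving(dose_fraction):
    """Saving relative to conventional full-dose spraying, in percent."""
    return 100.0 * (1.0 - dose_fraction.mean())

# Percentile bootstrap: resample parcels with replacement to balance replicates
# and obtain a confidence interval for the potential saving.
boot = np.array([
    herbicide_saving(rng.choice(modicovi_dose_fraction,
                                size=modicovi_dose_fraction.size, replace=True))
    for _ in range(5000)
])
print(herbicide_saving(modicovi_dose_fraction), np.percentile(boot, [2.5, 97.5]))
```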

451 Human Absorbed Dose Estimation of a New In-111 Imaging Agent Based on Rat Data

Authors: H. Yousefnia, S. Zolghadri

Abstract:

The measurement of organ radiation exposure dose is one of the most important initial steps in developing a new radiopharmaceutical. In this study, dosimetric studies of a novel agent for SPECT imaging of bone metastasis, the 111In-1,4,7,10-tetraazacyclododecane-1,4,7,10-tetramethylene phosphonic acid (111In-DOTMP) complex, were carried out to estimate the dose to human organs based on data derived from rats. The radiolabeled complex was prepared with high radiochemical purity under optimal conditions. The biodistribution of the complex was investigated in male Syrian rats at selected times after injection (2, 4, 24 and 48 h). The human absorbed dose estimation of the complex was made based on the data derived from rats using the radiation absorbed dose assessment resource (RADAR) method. The 111In-DOTMP complex was prepared with a high radiochemical purity of >99% (ITLC). The total-body effective absorbed dose for 111In-DOTMP was 0.061 mSv/MBq. This value is comparable to other clinically used 111In complexes. The results show that the dose to the critical organs is within the acceptable range for diagnostic nuclear medicine procedures. Generally, 111In-DOTMP has interesting characteristics and can be considered a viable agent for SPECT imaging of bone metastasis in the near future.

Keywords: In-111, DOTMP, Internal Dosimetry, RADAR.
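For context, the RADAR method referred to above follows the standard MIRD-type dose equation: the absorbed dose to a target organ is the sum over source organs of the cumulated activity times a tabulated dose factor. The relation is shown below in generic notation; the organ-level values of this study are not reproduced.

```latex
D(r_T) \;=\; \sum_{r_S} \tilde{A}(r_S)\,\mathrm{DF}(r_T \leftarrow r_S)
```

where \tilde{A}(r_S) is the time-integrated (cumulated) activity in source organ r_S, here derived from the rat biodistribution data extrapolated to humans, and DF(r_T ← r_S) is the absorbed dose in target organ r_T per disintegration in r_S.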

450 Artificial Intelligence-Based Chest X-Ray Test of COVID-19 Patients

Authors: Dhurgham Al-Karawi, Nisreen Polus, Shakir Al-Zaidi, Sabah Jassim

Abstract:

The management of COVID-19 patients based on chest imaging is emerging as an essential tool for evaluating the spread of the pandemic, which has gripped the global community. It has already been used to monitor the situation of COVID-19 patients with respiratory problems. The use of chest imaging for the medical triage of patients showing moderate to severe clinical COVID-19 features has increased, due to the fast spread of the pandemic to all continents and communities. This article demonstrates the development of machine learning techniques for testing COVID-19 patients using chest X-ray (CXR) images in near real time, distinguishing COVID-19 infection with a significantly high level of accuracy. The testing performance has covered a combination of different datasets of CXR images: positive COVID-19 patients, patients with viral and bacterial infections, and people with clear chests. The proposed AI scheme successfully distinguishes CXR scans of COVID-19 infected patients from CXR scans of viral and bacterial pneumonia as well as normal cases, with an average accuracy of 94.43%, a sensitivity of 95%, and a specificity of 93.86%. Predicted decisions are supported by visual evidence to help clinicians speed up the initial assessment of new suspected cases, especially in resource-constrained environments.

Keywords: COVID-19, chest x-ray scan, artificial intelligence, texture analysis, local binary pattern transform, Gabor filter.
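The texture pipeline suggested by the keywords (local binary patterns and Gabor responses feeding a classifier) can be sketched roughly as follows. The parameter values, the SVM back end and the feature pooling are illustrative assumptions, not the authors' exact configuration:

```python
import numpy as np
from skimage.feature import local_binary_pattern
from skimage.filters import gabor
from sklearn.svm import SVC

def cxr_texture_features(img, lbp_points=8, lbp_radius=1):
    """Concatenate a uniform-LBP histogram with mean/std of a few Gabor responses."""
    lbp = local_binary_pattern(img, lbp_points, lbp_radius, method="uniform")
    hist, _ = np.histogram(lbp, bins=lbp_points + 2,
                           range=(0, lbp_points + 2), density=True)
    gabor_stats = []
    for theta in (0.0, np.pi / 4, np.pi / 2, 3 * np.pi / 4):
        real, _ = gabor(img, frequency=0.2, theta=theta)
        gabor_stats += [real.mean(), real.std()]
    return np.concatenate([hist, gabor_stats])

# X_imgs: list of grayscale CXR arrays; y: labels (0 normal, 1 viral/bacterial, 2 COVID-19)
# X = np.array([cxr_texture_features(img) for img in X_imgs])
# clf = SVC(kernel="rbf").fit(X, y)
```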

449 Moral Reasoning and Behaviour in Adulthood

Authors: O. Matarazzo, L. Abbamonte, G. Nigro

Abstract:

This study aimed at assessing whether and to what extent moral judgment and behaviour were: 1. situation-dependent; 2. selectively dependent on cognitive and affective components; 3. influenced by gender and age; 4. reciprocally congruent. In order to achieve these aims, four different types of moral dilemmas were constructed, and five types of thinking, representing five possible ways to evaluate the situation, were presented for each of them. The judgment criteria included selfishness, altruism, sense of justice, and the conflict between selfishness and the two moral issues. The participants were 250 unpaid volunteers (50% male, 50% female) belonging to two age groups: young people and adults. The study entailed a 2 (gender) × 2 (age group) × 5 (type of thinking) × 4 (situation) mixed design: the first two variables were between-subjects, the others within-subjects. Results have shown that: 1. moral judgment and behaviour are at least partially affected by the type of situation and by interpersonal variables such as gender and age; 2. moral reasoning depends in a similar manner on cognitive and affective factors; 3. there is no gender polarity between the ethic of justice and the ethic of care/altruism; 4. moral reasoning and behaviour are perceived as reciprocally congruent, even though their congruence decreases under a more objective assessment. These results are discussed in the light of contrasting theories of morality.

Keywords: Contextual-pragmatic approach to morality, ethic of care, ethic of justice, Kohlbergian approach, moral behaviour, moral reasoning.

448 Assessment of Pier Foundations for Onshore Wind Turbines in Non-cohesive Soil

Authors: Mauricio Terceros, Jann-Eike Saathoff, Martin Achmus

Abstract:

In non-cohesive soil, onshore wind turbines are often found on shallow foundations with a circular or octagonal shape. For the current generation of wind turbines, shallow foundations with very large breadths are required. The foundation support costs thus represent a considerable portion of the total construction costs. Therefore, an economic optimization of the type of foundation is highly desirable. A conceivable alternative foundation type would be a pier foundation, which combines the load transfer over the foundation area at the pier base with the transfer of horizontal loads over the shaft surface of the pier. The present study aims to evaluate the load-bearing behavior of a pier foundation based on comprehensive parametric studies. Thereby, three-dimensional numerical simulations of both pier and shallow foundations are developed. The evaluation of the results focuses on the rotational stiffnesses of the proposed soil-foundation systems. In the design, the initial rotational stiffness is decisive for consideration of natural frequencies, whereas the rotational secant stiffness for a maximum load is decisive for serviceability considerations. A systematic analysis of the results at different load levels shows that the application of the typical pier foundation is presumably limited to relatively small onshore wind turbines.

Keywords: Onshore wind foundation, pier foundation, rotational stiffness of soil-foundation system, shallow foundation.
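The two stiffness measures evaluated in the parametric study can be written as follows; the symbols are generic notation assumed here, since the paper's own notation is not given in the abstract.

```latex
k_{R,\mathrm{ini}} \;=\; \left.\frac{\mathrm{d}M}{\mathrm{d}\theta}\right|_{\theta \to 0},
\qquad
k_{R,\mathrm{sec}} \;=\; \frac{M_{\max}}{\theta(M_{\max})}
```

where M is the overturning moment at foundation level and θ the foundation rotation; the initial (tangent) stiffness governs natural-frequency checks, while the secant stiffness at the maximum load governs serviceability.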

447 Ontology of Collaborative Supply Chain for Quality Management

Authors: Jiaqi Yan, Sherry Sun, Huaiqing Wang, Zhongsheng Hua

Abstract:

In the highly competitive and rapidly changing global marketplace, independent organizations and enterprises often come together and form a temporary alignment, a virtual enterprise in a supply chain, to better provide products or services. As firms adopt the systems approach implicit in supply chain management, they must manage quality through both internal process control and external control of supplier quality and customer requirements. How to incorporate the quality management of upstream and downstream supply chain partners into a firm's own quality management system has recently received a great deal of attention from both academia and practice. This paper investigates the collaborative features and the entity relationships in a supply chain, and presents an ontology of the collaborative supply chain by aligning a service-oriented framework with service-dominant logic. This perspective facilitates the separation of material flow management from manufacturing capability management, which provides a foundation for the coordination and integration of the business processes used to measure, analyze, and continually improve the quality of products, services, and processes. Further, this approach characterizes the different interests of supply chain partners, providing an innovative way to analyze the collaborative features of a supply chain. Finally, this ontology is the foundation for developing a quality management system that internalizes the quality management of upstream and downstream supply chain partners and manages quality in the supply chain systematically.

Keywords: Ontology, supply chain quality management, service-oriented architecture, service-dominant logic.

446 Aerodynamic Design Optimization of High-Speed Hatchback Cars for Lucrative Commercial Applications

Authors: A. Aravind, M. Vetrivel, P. Abhimanyu, C. A. Akaash Emmanuel Raj, K. Sundararaj, V. R. S. Kumar

Abstract:

The demand for high-speed, low-budget hatchback cars with diversified options is increasing to meet the trend of new-generation buyers. This paper aims to augment the current speed of hatchback cars through an aerodynamic drag reduction technique. Inverted airfoils are fitted at the bottom of the car to generate downward force, negating lift while increasing the usable speed range and achieving better road performance. The numerical simulations have been carried out using a 2D steady pressure-based k-ε realizable model with enhanced wall treatment. In our numerical studies, a Reynolds-averaged Navier-Stokes model and its solution code are used. The code is calibrated and validated using the exact solution of the 2D boundary layer displacement thickness at the Sanal flow choking condition for adiabatic flows. Through parametric analytical studies, we observed that inverted airfoils integrated with the bottom surface at various predesigned locations of hatchback cars can improve their overall aerodynamic efficiency through drag reduction, which significantly decreases fuel consumption and ensures optimum road performance at the maximum permissible speed within the framework of the manufacturer's constraints.

Keywords: Aerodynamics of commercial cars, downward force, hatchback car, inverted airfoil.

445 Artificial Neural Networks Technique for Seismic Hazard Prediction Using Seismic Bumps

Authors: Belkacem Selma, Boumediene Selma, Samira Chouraqui, Hanifi Missoum, Tourkia Guerzou

Abstract:

Natural disasters have occurred and will continue to occur, causing human and material damage. Therefore, the idea of "preventing" natural disasters will never be realizable. However, their prediction is possible with the advancement of technology, and even if natural disasters are effectively inevitable, their consequences may be partly controlled. The rapid growth and progress of artificial intelligence (AI) has had a major impact on the prediction of natural disasters and on risk assessment, which are necessary for effective disaster reduction. Earthquake prediction, to prevent the loss of human lives and property damage, is an important goal; that is why it is crucial to develop techniques for predicting this natural disaster. This study aims to analyze the ability of artificial neural networks (ANNs) to predict earthquakes that occur in a given area. The data used describe the problem of forecasting high-energy (higher than 10^4 J) seismic bumps in a coal mine, using two longwalls as an example. For this purpose, seismic-bump data obtained from mines have been analyzed. The results obtained show that the ANN is able to predict earthquake parameters with high accuracy; the classification accuracy through neural networks is more than 94%, and the models developed are efficient and robust and depend only weakly on the initial database.

Keywords: Earthquake prediction, artificial intelligence, AI, Artificial Neural Network, ANN, seismic bumps.
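A minimal sketch of the kind of ANN classifier described above is given below. The network size, the scikit-learn implementation and the placeholder feature matrix are assumptions; the authors' architecture and the actual mine-monitoring features are not reproduced:

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.neural_network import MLPClassifier
from sklearn.metrics import accuracy_score

# X: numeric mine-monitoring features (e.g., seismic energy and pulse counts per shift);
# y: 1 if a high-energy (> 1e4 J) bump occurred in the following shift, else 0.
X = np.random.rand(2584, 15)              # placeholder feature matrix
y = np.random.randint(0, 2, 2584)         # placeholder labels

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, stratify=y, random_state=0)
scaler = StandardScaler().fit(X_tr)

ann = MLPClassifier(hidden_layer_sizes=(32, 16), activation="relu",
                    max_iter=500, random_state=0)
ann.fit(scaler.transform(X_tr), y_tr)
print("accuracy:", accuracy_score(y_te, ann.predict(scaler.transform(X_te))))
```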

444 Identifying Quality Islamic Content in Community Question Answering Sites

Authors: Rabia Bibi, Muhammad Shahzad Faisal, Khalid Iqbal, Atif Inayat

Abstract:

The Internet is growing rapidly, and new community-based content is added by people every second. With this fast-growing community-based content, if a user requires answers to particular questions, then reviews from experts or the community are required. The Muslim community all over the world is seeking help to get their questions and issues discussed and answered. Online web portals of religious schools and community-based question answering (CQA) sites are two big platforms for solving users' issues. In the case of religious schools, there are experts and qualified religious scholars (muftis) who can give expert opinions. However, the quality of community-based content cannot be guaranteed, as it may not be an answer that satisfies the question of a user. Users on CQA sites may include spammers or individuals criticizing the questioner instead of providing useful answers. In this paper, we investigate strategies to automatically identify the right content. As an experiment, we concentrate on Yahoo! Answers and Quora, popular online QA sites where questions are asked, answered, edited, and organized by a large community of users. We present a classification of the data to categorize relevant and irrelevant answers. Specifically, we demonstrate that the proposed framework can isolate quality answers from the rest with an accuracy close to that of humans.

Keywords: Community-based question and answering, evaluation and prediction of quality answer, answer classification, Islamic content, answer ranking.
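A minimal sketch of an answer-quality classifier of the kind described (TF-IDF features with a linear model) is shown below. The feature choice, the classifier and the placeholder corpus are assumptions, not the authors' framework:

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import cross_val_score

# answers: answer texts collected from CQA sites (e.g., Yahoo! Answers, Quora);
# labels: 1 = relevant/quality answer, 0 = irrelevant (spam, criticism of the questioner, ...).
answers = ["...answer text...", "...another answer..."]   # placeholder corpus
labels = [1, 0]

clf = make_pipeline(
    TfidfVectorizer(ngram_range=(1, 2)),   # simple lexical features
    LogisticRegression(max_iter=1000),
)
# With the real labelled corpus in place:
# print(cross_val_score(clf, answers, labels, cv=5).mean())
```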

443 Teaching Math to Preschool Children with Autism

Authors: Hui Fang Huang Su, Jia Borror

Abstract:

This study compared two different interventions for math instruction among preschoolers with autism spectrum disorder (ASD). The first intervention, a combination of discrete trial teaching and Strategies for Teaching Based on Autism Research (STAR), was the regular math curriculum utilized at the preschool. The second, activity-based, naturalistic intervention was Project MIND, also known as Math is Not Difficult. The curricular interventions were randomly assigned to four preschool classrooms with ASD students, and Project MIND was implemented over three months; measurements gained during the same three months for the STAR intervention were used. A quasi-experimental, pre-test/post-test design was selected to compare which intervention was the most effective in increasing mathematical knowledge and skills among preschoolers with ASD. Standardized pre- and post-test instruments included the Bracken Basic Concept Scale-3 Receptive, the Applied Problems and Calculation subtests of the Woodcock-Johnson IV Tests of Achievement, and the TEMA-3: Test of Early Mathematics Ability, Third Edition. The STAR assessment is typically administered to all preschoolers at the study site three times per year, and those results were used in this study. We anticipated that the implementation of these two approaches would lead to improvement in the mathematical knowledge and skills of children with ASD, but it remains essential to see whether a behavioral or a naturalistic teaching approach leads to more significant results.

Keywords: Autism, mathematics, preschool, special education.

442 Applying the Extreme-Based Teaching Model in Post-Secondary Online Classroom Setting: A Field Experiment

Authors: Leon Pan

Abstract:

The first programming course within post-secondary education has long been recognized as a challenging endeavor for both educators and students alike. Historically, these courses have exhibited high failure rates and a notable number of dropouts. Instructors often lament students' lack of effort on their coursework, and students often express frustration that the teaching methods employed are not effective. Drawing inspiration from the successful principles of Extreme Programming, this study introduces the Extreme-based teaching model, an approach aimed at enhancing the teaching of introductory programming courses. To empirically determine the effectiveness of the model, a comparison was made between a section taught using the Extreme-based model and another utilizing traditional teaching methods. Notably, the Extreme-based teaching class required students to work collaboratively on projects, while also demanding continuous assessment and performance enhancement within groups. This paper details the application of the Extreme-based model within the post-secondary online classroom context and presents results that emphasize its effectiveness in advancing the teaching and learning experience. The Extreme-based model led to a significant increase of 13.46 points in the weighted total average and a commendable 10% reduction in the failure rate.

Keywords: Extreme-based teaching model, innovative pedagogical methods, project-based learning, team-based learning.

441 Through Biometric Card in Romania: Person Identification by Face, Fingerprint and Voice Recognition

Authors: Hariton N. Costin, Iulian Ciocoiu, Tudor Barbu, Cristian Rotariu

Abstract:

In this paper, three different approaches for person verification and identification, i.e., by means of fingerprints, face and voice recognition, are studied. Face recognition uses parts-based representation methods and a manifold learning approach. The assessment criterion is recognition accuracy. The techniques under investigation are: a) Local Non-negative Matrix Factorization (LNMF); b) Independent Component Analysis (ICA); c) NMF with sparse constraints (NMFsc); d) Locality Preserving Projections (Laplacianfaces). Fingerprint detection was approached by classical minutiae (small graphical patterns) matching through image segmentation, using a structural approach and a neural network as the decision block. As for voice/speaker recognition, mel-cepstral and delta-delta mel-cepstral analysis were used as the main methods in order to construct a supervised, speaker-dependent voice recognition system. The final decision (e.g., "accept/reject" for a verification task) is taken by using a majority voting technique applied to the three biometrics. The preliminary results, obtained for medium-sized databases of fingerprints, faces and voice recordings, indicate the feasibility of our study and an overall recognition precision (about 92%) permitting the utilization of our system for a future complex biometric card.

Keywords: Biometry, image processing, pattern recognition, speech analysis.
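The final fusion step described above (majority voting over the three biometric decisions) is straightforward; a sketch is given below under the assumption that each matcher already outputs an accept/reject decision:

```python
from collections import Counter

def fuse_decisions(face_ok: bool, fingerprint_ok: bool, voice_ok: bool) -> bool:
    """Majority vote over the three biometric verifiers: accept if at least two agree."""
    votes = Counter([face_ok, fingerprint_ok, voice_ok])
    return votes[True] >= 2

print(fuse_decisions(True, False, True))   # -> True (accept)
print(fuse_decisions(False, False, True))  # -> False (reject)
```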

440 Nonlinear Sensitive Control of Centrifugal Compressor

Authors: F. Laaouad, M. Bouguerra, A. Hafaifa, A. Iratni

Abstract:

In this work, we treat control problems related to complex processes in chemical and petrochemical plants, taking the centrifugal compressor as an example: a system that is very complex in both its physical structure and its behaviour (the surge phenomenon). We propose to study the applicability of recent control approaches to the compressor behaviour and, consequently, to evaluate their contribution in both practical and theoretical terms. Facing the complexity of the studied industrial process, we choose to resort to fuzzy logic for the analysis and treatment of its control problem, because these techniques constitute the only framework in which the different types of imperfect knowledge (uncertainties, inaccuracies, etc.) can be treated jointly, offering suitable tools to characterise them. In the particular case of the centrifugal compressor, these imperfections are represented by modelling errors, neglected dynamics, non-modellable dynamics and parametric variations. The purpose of this paper is to produce a fully robust nonlinear controller design method that stabilizes the compression process at its optimum steady state by manipulating the gas flow rate. In order to cope with both the parameter uncertainty and the structured nonlinearity of the plant, the proposed method consists of a linear steady-state regulation that ensures robust optimal control and of a nonlinear compensation that achieves exact input/output linearization.

Keywords: Compressor, Fuzzy logic, Surge control, Bilinear controller, Stability analysis, Nonlinear plant.

439 Meteorological Risk Assessment for Ships with Fuzzy Logic Designer

Authors: Ismail Karaca, Ridvan Saracoglu, Omer Soner

Abstract:

Fuzzy logic, an advanced method to support decision-making, is used by scientists in many disciplines. Fuzzy programming is a product of fuzzy logic, fuzzy rules, and implication. In marine science, fuzzy programming for ships is increasing dramatically together with autonomous ship studies. In this paper, a program to support the decision-making process for ship navigation has been designed. The program is built on fuzzy logic and fuzzy rules, taking marine accidents and expert opinions into account. After the program was designed, it was tested on 46 ship accidents reported by the Transportation Safety Investigation Center of Turkey. Wind speed, sea condition, visibility, and the day/night ratio have been used as input data. They were converted into a risk factor within the Fuzzy Logic Designer application using fuzzy rules set by marine experts. Finally, the experts' meteorological risk factor for each accident was compared with the program's risk factor, and the error rate was calculated. The main objective of this study is to improve the navigational safety of ships by using an advanced decision support model. According to the study results, fuzzy programming is a robust model that supports safe navigation.

Keywords: Calculation of risk factor, fuzzy logic, fuzzy programming for ship, safe navigation of ships.
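A stripped-down Mamdani-style evaluation of such a rule base is sketched below. The membership breakpoints, rules and defuzzification are invented for illustration and do not reproduce the experts' rule set or the Fuzzy Logic Designer configuration:

```python
import numpy as np

def tri(x, a, b, c):
    """Triangular membership function with breakpoints a < b < c."""
    return np.maximum(np.minimum((x - a) / (b - a + 1e-9), (c - x) / (c - b + 1e-9)), 0.0)

def risk_factor(wind_kt, wave_m, visibility_nm):
    # Illustrative memberships (breakpoints are assumptions, not the experts' values).
    wind_high = tri(wind_kt, 20, 35, 50)
    wave_rough = tri(wave_m, 2.5, 4.5, 7.0)
    vis_poor = tri(visibility_nm, -1, 0, 2)

    # Mamdani-style rules: AND = min, aggregation over rules = max.
    high_risk = max(min(wind_high, wave_rough), vis_poor)
    low_risk = 1.0 - high_risk
    # Crude weighted-average defuzzification to a 0-10 risk factor.
    return (low_risk * 2.0 + high_risk * 9.0) / (low_risk + high_risk + 1e-9)

print(round(risk_factor(wind_kt=30, wave_m=4.0, visibility_nm=0.5), 2))
```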

438 A Software-Supported Methodology for Designing General-Purpose Interconnection Networks for Reconfigurable Architectures

Authors: Kostas Siozios, Dimitrios Soudris, Antonios Thanailakis

Abstract:

Modern applications realized on FPGAs exhibit high connectivity demands. Throughout this paper, we study the routing constraints of Virtex devices and propose a systematic methodology for designing a novel general-purpose interconnection network targeting reconfigurable architectures. This network consists of multiple segment wires and SB patterns, appropriately selected and assigned across the device. The goal of our proposed methodology is to maximize the hardware utilization of the fabricated routing resources. The derived interconnection scheme is integrated on a Virtex-style FPGA. This device is characterized both by its high performance and by its low energy requirements. Because of this, the design criterion that guided our architecture selections was the minimal Energy×Delay Product (EDP). The methodology is fully supported by three new software tools, which belong to the MEANDER Design Framework. Using a typical set of MCNC benchmarks, an extensive comparison study in terms of several critical parameters proves the effectiveness of the derived interconnection network. More specifically, we achieve an average Energy×Delay Product reduction of 63%, a performance increase of 26%, a 21% reduction in leakage power, and an 11% reduction in total energy consumption, at the expense of a 20% increase in channel width.

Keywords: Design Methodology, FPGA, Interconnection, Low-Energy, High-Performance, CAD tool.

437 A Review of the Antecedents and Consequences of Employee Engagement

Authors: Ibrahim Hamidu Magem

Abstract:

Employee engagement has continued to gain popularity among practitioners, consultants and academicians in recent years. This is due to the fact that engaged employees are central to organizational success in today's highly competitive and rapidly changing business environment. Employee engagement describes a situation in which employees harness themselves to their work roles, and its importance to organizations cannot be overemphasized. Organizations both large and small are constantly striving to improve their performance, retain employees, reduce absenteeism, and create loyal customers, among other goals. To achieve these goals, organizations need a team of highly engaged employees. In line with this, the study attempts to provide a valuable framework for understanding the antecedents and consequences of employee engagement in organizations. The paper categorizes the antecedents of employee engagement into individual and organizational factors, the existence of which is assumed to result in engaged employees who will benefit the organization. It is therefore recommended that organizations revisit and redesign their employee engagement systems to enable them to attain their organizational goals and objectives. In addition, organizations should note that engagement is personal, but organizational engagement programmes should be about everyone in the organization. The findings of this paper add to existing studies on employee engagement and also raise awareness among academics and practitioners of the importance of employee engagement for improving organizational efficiency and effectiveness, as well as overall firm performance.

Keywords: Antecedent, employee engagement, job involvement, organization.

436 Condition Monitoring for Twin-Fluid Nozzles with Internal Mixing

Authors: C. Lanzerstorfer

Abstract:

Liquid sprays of water are frequently used in air pollution control for gas cooling purposes and for gas cleaning. Twin-fluid nozzles with internal mixing are often used for these purposes because of the small size of the drops produced. In these nozzles the liquid is dispersed by compressed air or another pressurized gas. In high efficiency scrubbers for particle separation, several nozzles are operated in parallel because of the size of the cross section. In such scrubbers, the scrubbing water has to be re-circulated. Precipitation of some solid material can occur in the liquid circuit, caused by chemical reactions. When such precipitations are detached from the place of formation, they can partly or totally block the liquid flow to a nozzle. Due to the resulting unbalanced supply of the nozzles with water and gas, the efficiency of separation decreases. Thus, the nozzles have to be cleaned if a certain fraction of blockages is reached. The aim of this study was to provide a tool for continuously monitoring the status of the nozzles of a scrubber based on the available operation data (water flow, air flow, water pressure and air pressure). The difference between the air pressure and the water pressure is not well suited for this purpose, because the difference is quite small and therefore very exact calibration of the pressure measurement would be required. Therefore, an equation for the reference air flow of a nozzle at the actual water flow and operation pressure was derived. This flow can be compared with the actual air flow for assessment of the status of the nozzles.

Keywords: Twin-fluid nozzles, operation data, condition monitoring, flow equation.

435 A Software Framework for Predicting Oil-Palm Yield from Climate Data

Authors: Mohd. Noor Md. Sap, A. Majid Awan

Abstract:

Intelligent systems based on machine learning techniques, such as classification and clustering, are gaining widespread popularity in real-world applications. This paper presents work on developing a software system for predicting crop yield, for example oil-palm yield, from climate and plantation data. At the core of our system is a method for the unsupervised partitioning of data for finding spatio-temporal patterns in climate data using kernel methods, which offer the strength to deal with complex data. This work draws inspiration from the notion that a non-linear transformation of the data into some high-dimensional feature space increases the possibility of linear separability of the patterns in the transformed space, and therefore simplifies the exploration of the associated structure in the data. Kernel methods implicitly perform a non-linear mapping of the input data into a high-dimensional feature space by replacing the inner products with an appropriate positive definite function. In this paper, we present a robust weighted kernel k-means algorithm incorporating spatial constraints for clustering the data. The proposed algorithm can effectively handle noise, outliers and auto-correlation in the spatial data, allowing effective and efficient data analysis by exploring patterns and structures in the data, and thus can be used for predicting oil-palm yield by analyzing the various factors affecting the yield.

Keywords: Pattern analysis, clustering, kernel methods, spatial data, crop yield.
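A compact sketch of weighted kernel k-means with an RBF kernel is given below; the spatial-constraint term and the specific weighting scheme of the paper are not reproduced, and the data are placeholders:

```python
import numpy as np

def rbf_kernel(X, gamma=0.5):
    sq = np.sum(X**2, axis=1)
    return np.exp(-gamma * (sq[:, None] + sq[None, :] - 2 * X @ X.T))

def weighted_kernel_kmeans(K, w, k, n_iter=50, seed=0):
    """Weighted kernel k-means: assignments use feature-space distances only."""
    rng = np.random.default_rng(seed)
    n = K.shape[0]
    labels = rng.integers(0, k, size=n)
    for _ in range(n_iter):
        dist = np.zeros((n, k))
        for c in range(k):
            mask = labels == c
            wc = w[mask]
            sc = wc.sum() + 1e-12
            # ||phi(x_i) - m_c||^2 = K_ii - (2/s_c) sum_j w_j K_ij + (1/s_c^2) sum_jl w_j w_l K_jl
            dist[:, c] = (np.diag(K)
                          - 2.0 * (K[:, mask] @ wc) / sc
                          + (wc @ K[np.ix_(mask, mask)] @ wc) / sc**2)
        new_labels = dist.argmin(axis=1)
        if np.array_equal(new_labels, labels):
            break
        labels = new_labels
    return labels

X = np.random.rand(200, 4)   # e.g., gridded climate variables (placeholder)
w = np.ones(len(X))          # point weights (could down-weight noisy cells)
print(np.bincount(weighted_kernel_kmeans(rbf_kernel(X), w, k=3)))
```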

434 Assessment of the Efficiency of Virtual Orthodontic Consultations during COVID-19

Authors: R. Litt, A. Brown

Abstract:

Aims: We aimed to assess the efficiency of 'Attend Anywhere' orthodontic clinics within a district general hospital during COVID-19. Our secondary aim was to pilot a questionnaire to assess patient satisfaction with virtual orthodontic appointments. Design: The study is a service evaluation including a pilot questionnaire. Methods: The average number of patients seen per virtual clinic and the number of patients failing to attend were compared to face-to-face clinics. The capability of virtual appointments to prevent the need for a face-to-face appointment was assessed. Patients were invited to complete a telephone pilot questionnaire focusing on patient satisfaction and accessibility. Results: There was a small increase in the number of patients failing to attend virtual appointments, with a third of the patients who did not attend failing to receive the appointment link. 81.9% of virtual clinic appointments were successful and prevented the need for a face-to-face appointment. Overall, patients were very satisfied with their virtual orthodontic appointment, and the majority required no assistance to access the service. Conclusions: The use of 'Attend Anywhere' clinics in orthodontics offers patients and clinicians an effective and efficient alternative to face-to-face appointments that patients, on average, find easy to use and completely satisfactory.

Keywords: Clinics, COVID-19, orthodontics, patient satisfaction, virtual.

433 A Mobile Multihop Relay Dynamic TDD Scheme for Cellular Networks

Authors: Jong-Moon Chung, Hyung-Weon Cho, Ki-Yong Jin, Min-Hee Cho

Abstract:

In this paper, we present an analytical framework for the evaluation of the uplink performance of multihop cellular networks based on dynamic time division duplex (TDD). New wireless broadband protocols, such as WiMAX, WiBro, and 3G-LTE, apply TDD, and mobile communication protocols under standardization (e.g., IEEE 802.16j) are investigating mobile multihop relay (MMR) as a future technology. In this paper, a novel MMR TDD scheme is presented, where the dynamic range of the frame is shared between traffic resources of an asymmetric nature and multihop relaying. The mobile communication channel interference model comprises inner and co-channel interference (CCI). The performance analysis focuses on the uplink, due to the fact that the effects of dynamic resource allocation show significant performance degradation only in the uplink compared to time division multiple access (TDMA) schemes due to CCI [1-3], whereas the downlink turns out to be the same or better. The analysis was based on the signal-to-interference power ratio (SIR) outage probability of dynamic TDD (D-TDD) and TDMA systems, which are the most widespread mobile communication multi-user control techniques. This paper presents the uplink SIR outage probability with multihop results and shows that the dynamic TDD scheme applying MMR can provide a performance improvement compared to single-hop applications if executed properly.

Keywords: Co-Channel Interference, Dynamic TDD, Mobile Multihop Relay, Cellular Network, Time Division Multiple Access.

432 Evaluation of the Internal Quality for Pineapple Based on the Spectroscopy Approach and Neural Network

Authors: Nonlapun Meenil, Pisitpong Intarapong, Thitima Wongsheree, Pranchalee Samanpiboon

Abstract:

In Thailand, once pineapples are harvested, they must be classified into two classes based on their sweetness: sweet and unsweet. This paper studies and develops the assessment of the internal quality of pineapples using a low-cost compact spectroscopy sensor and a neural network (NN). In the experiments, Batavia pineapples were used, yielding 100 samples. The extracted juice of each sample was used to determine the soluble solid content (SSC), which was used to label the samples as sweet or unsweet. In terms of experimental equipment, a sensor cover was specifically designed to hold the sensor and light source so as to read the reflectance at a depth of 5 mm into the pineapple flesh. Using the spectroscopy sensor, data on visible and near-infrared (Vis-NIR) reflectance were collected. The NN was used to classify the pineapple classes. Before the classification step, the preprocessing methods of class balancing, data shuffling, and standardization were applied. The 510 nm and 900 nm reflectance values of the middle parts of the pineapples were used as the features of the NN. With a sequential model and the ReLU activation function, 100% accuracy on the training set and 76.67% accuracy on the test set were achieved. Based on these results, the low-cost compact spectroscopy sensor achieved favorable results in classifying the sweetness of pineapples into the two classes.

Keywords: Spectroscopy, soluble solid content, pineapple, neural network.
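A minimal version of the described pipeline (standardized two-feature input, a small sequential ReLU network, binary sweet/unsweet output) might look like the sketch below; the layer sizes, training settings and placeholder data are assumptions, not the authors' exact model:

```python
import numpy as np
import tensorflow as tf
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler

# X: reflectance at 510 nm and 900 nm per pineapple; y: 1 = sweet, 0 = unsweet (from SSC).
X = np.random.rand(100, 2)            # placeholder for the 100 measured samples
y = np.random.randint(0, 2, 100)      # placeholder SSC-derived labels

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, stratify=y, random_state=1)
scaler = StandardScaler().fit(X_tr)   # standardization, as in the preprocessing step

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(2,)),
    tf.keras.layers.Dense(16, activation="relu"),
    tf.keras.layers.Dense(8, activation="relu"),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.fit(scaler.transform(X_tr), y_tr, epochs=100, batch_size=8, verbose=0)
print(model.evaluate(scaler.transform(X_te), y_te, verbose=0))
```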

431 ECG Based Reliable User Identification Using Deep Learning

Authors: R. N. Begum, Ambalika Sharma, G. K. Singh

Abstract:

Identity theft has serious ramifications beyond data and personal information loss. This necessitates the implementation of robust and efficient user identification systems. Automatic biometric recognition systems are therefore the need of the hour, and electrocardiogram (ECG)-based systems are unquestionably the best choice due to their appealing inherent characteristics. Convolutional neural networks (CNNs) are the recent state-of-the-art technique for ECG-based user identification systems. However, the results obtained are significantly below standards, and the situation worsens as the number of users and the types of heartbeats in the dataset grow. As a result, this study proposes a highly accurate and resilient ECG-based person identification system using CNNs' dense learning framework. The proposed research explicitly explores the caliber of dense CNNs in the field of ECG-based human recognition. The study tests four different configurations of dense CNNs, which are trained on a dataset of recordings collected from eight popular ECG databases. With a highest false acceptance rate (FAR) of 0.04% and a highest false rejection rate (FRR) of 5%, the best-performing network achieved an identification accuracy of 99.94%. The best network was also tested with various train/test split ratios. The findings show that DenseNets are not only extremely reliable but also highly efficient. Thus, they might also be implemented in real-time ECG-based human recognition systems.

Keywords: Biometrics, dense networks, identification rate, train/test split ratio.
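The two reported error metrics can be computed from matcher scores as sketched below; the threshold and the placeholder score distributions are assumptions, not the authors' evaluation code:

```python
import numpy as np

def far_frr(genuine_scores, impostor_scores, threshold):
    """FAR: impostors wrongly accepted; FRR: genuine users wrongly rejected (both in %)."""
    genuine = np.asarray(genuine_scores)
    impostor = np.asarray(impostor_scores)
    far = np.mean(impostor >= threshold) * 100.0   # percent of impostor attempts accepted
    frr = np.mean(genuine < threshold) * 100.0     # percent of genuine attempts rejected
    return far, frr

# Placeholder similarity scores produced by an identification network.
genuine = np.random.normal(0.9, 0.05, 1000)
impostor = np.random.normal(0.3, 0.10, 1000)
print(far_frr(genuine, impostor, threshold=0.6))
```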
