Search results for: learning tool.
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 3473

173 Evolution of Web Development Techniques in Modern Technology

Authors: Abdul Basit Kiani, Maryam Kiani

Abstract:

The art of web development in new technologies is a dynamic journey, shaped by the constant evolution of tools and platforms. With the emergence of JavaScript frameworks and APIs, web developers are empowered to craft web applications that are not only robust but also highly interactive. The aim is to provide an overview of the developments in the field. The integration of artificial intelligence (AI) and machine learning (ML) has opened new horizons in web development. Chatbots, intelligent recommendation systems, and personalization algorithms have become integral components of modern websites. These AI-powered features enhance user engagement, provide personalized experiences, and streamline customer support processes, revolutionizing the way businesses interact with their audiences. Lastly, the emphasis on web security and privacy has been a pivotal area of progress. With the increasing incidents of cyber threats, web developers have implemented robust security measures to safeguard user data and ensure secure transactions. Innovations such as HTTPS protocol, two-factor authentication, and advanced encryption techniques have bolstered the overall security of web applications, fostering trust and confidence among users. Hence, recent progress in web development has propelled the industry forward, enabling developers to craft innovative and immersive digital experiences. From responsive design to AI integration and enhanced security, the landscape of web development continues to evolve, promising a future filled with endless possibilities.

Keywords: Web development, software testing, progressive web apps, web and mobile native applications.

172 Combining the Deep Neural Network with the K-Means for Traffic Accident Prediction

Authors: Celso L. Fernando, Toshio Yoshii, Takahiro Tsubota

Abstract:

Understanding the causes of road accidents and predicting their occurrence is key to preventing the deaths and serious injuries that result from them. Traditional statistical methods such as Poisson and logistic regressions have been used to find the association of traffic environmental factors with accident occurrence; more recently, the artificial neural network (ANN), a computational technique that learns from historical data to make more accurate predictions, has emerged. Despite this predictive ability, the ANN has difficulty dealing with a highly unbalanced distribution of attribute patterns in the training dataset; in such circumstances, the ANN treats the minority group as noise. In real-world data, however, the minority group is often the group of interest; e.g., in road traffic accident data, the accident events are the group of interest. This study proposes combining k-means clustering with the ANN to improve the predictive ability of the neural network model by alleviating the effect of the unbalanced distribution of attribute patterns in the training dataset. The results show that the proposed method improves the ability of the neural network to make predictions on a dataset with a highly unbalanced attribute-pattern distribution; on an evenly distributed dataset, however, the proposed method performs almost like a standard neural network.
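
The abstract does not spell out how the k-means step and the network are coupled; one plausible reading, assumed in the sketch below, is that the majority class (non-accident records) is compressed into k-means centroids so the network trains on a balanced set. All names and sizes here are illustrative.

```python
# Hedged sketch: balance a binary training set by replacing the majority
# class with k-means centroids, then train a neural network. The paper's
# exact k-means/ANN coupling may differ.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.neural_network import MLPClassifier

def kmeans_balanced_fit(X, y, random_state=0):
    X_min, X_maj = X[y == 1], X[y == 0]   # y = 1: accident events (minority)
    k = len(X_min)                        # one centroid per minority sample
    km = KMeans(n_clusters=k, n_init=10, random_state=random_state).fit(X_maj)
    X_bal = np.vstack([km.cluster_centers_, X_min])
    y_bal = np.hstack([np.zeros(k), np.ones(len(X_min))])
    clf = MLPClassifier(hidden_layer_sizes=(32, 16), max_iter=1000,
                        random_state=random_state)
    return clf.fit(X_bal, y_bal)
```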

Keywords: Accident risks estimation, artificial neural network, deep learning, k-means, road safety.

171 Health Psychology Intervention – Identifying Early Symptoms in Neurological Disorders

Authors: Simon B. N. Thompson

Abstract:

Cortisol is essential to the regulation of the immune system, and pathological yawning is a symptom of multiple sclerosis (MS). Electromyography (EMG) activity in the jaw muscles typically rises when the muscles are moved – extended or flexed – and yawning has been shown to be highly correlated with cortisol levels in healthy people, as described by the Thompson Cortisol Hypothesis. It is likely that these elevated cortisol levels are also seen in people with MS. The possible link between EMG in the jaw muscles and rises in saliva cortisol levels during yawning was investigated in a randomized controlled trial of 60 volunteers aged 18-69 years who were exposed to conditions designed to elicit the yawning response. Saliva samples were collected at the start and after yawning, or at the end of the presentation of yawning-provoking stimuli in the absence of a yawn, and EMG data were additionally collected during rest and yawning phases. The Hospital Anxiety and Depression Scale, Yawning Susceptibility Scale, General Health Questionnaire, and demographic and health details were collected, and the following exclusion criteria were adopted: chronic fatigue, diabetes, fibromyalgia, heart condition, high blood pressure, hormone replacement therapy, multiple sclerosis, and stroke. Significant differences were found between the saliva cortisol samples for the yawners, t(23) = -4.263, p < 0.001, whereas the corresponding rest-to-post-stimuli comparison for the non-yawners was non-significant. There were also significant differences between yawners and non-yawners for the EMG potentials, with the yawners having higher rest and post-yawning potentials. Significant evidence was found to support the Thompson Cortisol Hypothesis, suggesting that rises in cortisol levels are associated with the yawning response. Further research is underway to explore the use of cortisol as a potential diagnostic tool to assist the early diagnosis of symptoms related to neurological disorders. Bournemouth University Research & Ethics approval granted: JC28/1/13-KA6/9/13. Professional code of conduct, confidentiality, and safety issues have been addressed and approved in the Ethics submission. Trials identification number: ISRCTN61942768. http://www.controlled-trials.com/isrctn/
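
For readers unfamiliar with the reported statistic, the yawner comparison is a paired t-test on pre- and post-yawn cortisol. A minimal sketch with made-up values (the study's raw data are not given here):

```python
# Illustrative paired t-test of saliva cortisol before vs. after yawning.
# Values are hypothetical; the study reports t(23) = -4.263 for 24 yawners.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
pre = rng.normal(5.0, 1.0, size=24)          # cortisol at rest (made-up scale)
post = pre + rng.normal(0.8, 0.5, size=24)   # elevated after yawning

t, p = stats.ttest_rel(pre, post)
print(f"t({len(pre) - 1}) = {t:.3f}, p = {p:.4f}")
```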

Keywords: Cortisol, Electromyography, Neurology, Yawning.

170 Numerical Model of Low Cost Rubber Isolators for Masonry Housing in High Seismic Regions

Authors: Ahmad B. Habieb, Gabriele Milani, Tavio Tavio, Federico Milani

Abstract:

Housing in developing countries often has inadequate seismic protection, particularly masonry housing, which people choose because it is relatively cheap to build. Seismic protection of masonry remains an interesting issue among researchers. In this study, we develop a low-cost seismic isolation system for masonry using fiber-reinforced elastomeric isolators. The proposed elastomer consists of a few layers of rubber pads and fiber laminae, making it cheaper than conventional isolators. We present a finite element (FE) analysis to predict the behavior of the low-cost rubber isolators undergoing moderate deformations. The FE model of the elastomer involves a hyperelastic material property for the rubber pad; we adopt a Yeoh hyperelasticity model and estimate its coefficients from the available experimental data. Having characterized the shear behavior of the elastomers, we apply the isolation system to a small masonry house. To attach the isolators to the building, we model the shear behavior of the isolation system by means of a damped nonlinear spring model, which keeps the FE analysis computationally inexpensive. Several ground-motion records are applied to observe the sensitivity of the system. Roof acceleration and tensile damage of the walls are the parameters used to evaluate the performance of the isolators. A concrete damage plasticity model, available in the standard package of the Abaqus FE software, is used to model masonry in the nonlinear range. The results show that the proposed low-cost isolators are capable of reducing the roof acceleration and the damage level of masonry housing. The study also monitors the shear deformation of the isolators during seismic motion, which is useful for determining whether the isolator is applicable; according to the results, the deformations of the isolators on the benchmark one-story building are relatively small.
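
The Yeoh coefficients can be estimated by least squares from test data. In simple shear with strain γ, the first invariant is I₁ = 3 + γ², so the shear stress is τ = 2γ(C₁₀ + 2C₂₀γ² + 3C₃₀γ⁴). A sketch of the fit, with hypothetical data points standing in for the experimental curve:

```python
# Sketch: estimate Yeoh coefficients C10, C20, C30 from simple-shear data
# of a rubber pad (the data points below are hypothetical).
import numpy as np
from scipy.optimize import curve_fit

def yeoh_shear_stress(g, c10, c20, c30):
    # tau = 2*g*(C10 + 2*C20*g^2 + 3*C30*g^4), from W = sum Ci0 (I1-3)^i
    return 2.0 * g * (c10 + 2.0 * c20 * g**2 + 3.0 * c30 * g**4)

gamma = np.array([0.1, 0.25, 0.5, 0.75, 1.0, 1.25])   # shear strain
tau = np.array([0.11, 0.27, 0.52, 0.76, 1.02, 1.31])  # MPa, hypothetical

coeffs, _ = curve_fit(yeoh_shear_stress, gamma, tau, p0=(0.5, 0.01, 0.001))
print(dict(zip(("C10", "C20", "C30"), coeffs)))
```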

Keywords: Masonry, low-cost elastomeric isolator, finite element analysis, hyperelasticity, damped nonlinear spring, concrete damage plasticity.

169 Perception of Secondary Schools’ Students on Computer Education in Federal Capital Territory (FCT-Abuja), Nigeria

Authors: Salako Emmanuel Adekunle

Abstract:

Computer education refers to the knowledge and ability to use computers and related technology efficiently, with a range of skills covering levels from basic use to advanced. Computers continue to make an ever-increasing impact on all aspects of human endeavour, including education. Given the numerous benefits of computer education, what are the insights of students on it? This study investigated the perception of senior secondary school students on computer education in the Federal Capital Territory (FCT), Abuja, Nigeria. A sample of 7,500 senior secondary school students, drawn from one hundred (100) private and fifty (50) public schools within the FCT, was selected using a simple random sampling technique. A questionnaire [PSSSCEQ] was developed and validated through expert judgement, and a reliability coefficient of 0.84 was obtained; it was used to gather relevant data on computer education. Findings confirmed that the students in the FCT had a positive perception of computer education, and some factors affecting students' perception of computer education were identified. The null hypotheses were tested using t-test and ANOVA statistical analyses at the 0.05 level of significance. Based on these findings, some recommendations were made: competent teachers should be employed in all secondary schools to help students acquire relevant knowledge in computer education; technological support should be provided to all secondary schools to help students solve specific problems in computer education; and financial support should be provided to procure computer facilities that will enhance the teaching and learning of computer education.

Keywords: Computer education, perception, secondary school, students.

168 Mining User-Generated Contents to Detect Service Failures with Topic Model

Authors: Kyung Bae Park, Sung Ho Ha

Abstract:

Online user-generated contents (UGC) significantly change the way customers behave (e.g., shop, travel), and handling the overwhelming amount and variety of UGC is one of the paramount issues for management. However, current approaches (e.g., sentiment analysis) are often ineffective for leveraging textual information to detect the problems or issues that a given management team suffers from. In this paper, we apply Latent Dirichlet Allocation (LDA) text mining to a popular online review site dedicated to complaints from users. We find that LDA efficiently detects customer complaints, and that further inspection with a visualization technique is effective for categorizing the problems or issues. Management can thus identify the issues at stake and prioritize them accordingly in a timely manner, given a limited amount of resources. The findings provide managerial insights into how analytics on social media can help firms maintain and improve their reputation management. Our interdisciplinary approach also yields several insights from applying machine learning techniques in the marketing research domain. On a broader technical note, this paper illustrates the details of how to implement LDA in the R program from beginning (data collection in R) to end (LDA analysis in R), since such instruction is still largely undocumented; in this regard, it helps lower the barrier for interdisciplinary researchers to conduct related research.
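
The paper implements the pipeline in R; purely for illustration, a rough Python analogue of the same LDA workflow (toy complaint texts, arbitrary topic count) could look like:

```python
# Rough Python analogue of the R pipeline described above: vectorize
# complaint texts, fit LDA, inspect top words per topic.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

reviews = ["delivery was late and support never answered",
           "refund denied after the product arrived broken",
           "support hotline kept me waiting for an hour",
           "broken packaging and no refund offered"]

vec = CountVectorizer(stop_words="english")
X = vec.fit_transform(reviews)
lda = LatentDirichletAllocation(n_components=2, random_state=0).fit(X)

terms = vec.get_feature_names_out()
for k, comp in enumerate(lda.components_):
    top = [terms[i] for i in comp.argsort()[::-1][:5]]
    print(f"topic {k}: {top}")
```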

Keywords: Latent Dirichlet allocation, R program, text mining, topic model, user-generated contents, visualization.

167 The Estimation of Bird Diversity Loss and Gain as an Impact of Oil Palm Plantation: Study Case in KJNP Estate Riau Province

Authors: Yanto Santosa, Catharina Yudea

Abstract:

The rapid growth of the oil palm industry in Indonesia has drawn accusations from various parties that oil palm plantations damage the environment and biodiversity, including birds. Since research on the impacts of oil palm plantations on bird diversity is still limited, this study was developed to gain further learning and understanding. Data on bird diversity were collected in March 2018 in KJNP Estate, Riau Province, using the strip transect method on five different land cover types (young, intermediate, and old growth of oil palm plantation; a high conservation value area; and crops field as the baseline). The observations were conducted simultaneously, with three repetitions. The results show that the baseline has 19 species of birds, while the land cover after oil palm planting has 39 species. The HCV (high conservation value) area has the highest increase in diversity value. Oil palm plantation has changed the composition of bird species. The highest similarity index, 0.65, is shown by the young-growth oil palm land cover, while the lowest, 0.43, is shown by the HCV area. Overall, the existence of the oil palm plantation made a positive impact by increasing bird species diversity, with a total of 23 species gained and 3 species lost.
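
The abstract does not name the similarity index used; assuming the common Sørensen index, 2c/(a+b) for species counts a and b with c shared, the comparison can be computed as below (species lists are invented for illustration):

```python
# Sørensen similarity between bird species lists of the baseline (crops
# field) and a post-conversion land cover. The abstract's exact index
# formula is not stated; Sørensen is assumed here.
def sorensen(list_a, list_b):
    a, b = set(list_a), set(list_b)
    return 2 * len(a & b) / (len(a) + len(b))

baseline = {"bulbul", "munia", "swiftlet", "dove"}            # hypothetical
young_palm = {"bulbul", "munia", "swiftlet", "shrike", "prinia"}
print(f"similarity = {sorensen(baseline, young_palm):.2f}")
```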

Keywords: Bird diversity, crops field, impact of oil palm plantation, KJNP estate.

166 Outsourcing the Front End of Innovation

Authors: B. Likar, K. Širok

Abstract:

The paper presents a new method for efficient innovation process management. Even though innovation management methods, tools and knowledge are well established and documented in the literature, most companies still do not manage innovation efficiently. Especially in SMEs, the front end of innovation - problem identification, idea creation and selection - is often not optimally performed. Our eMIPS methodology represents a sort of "umbrella methodology" - a well-defined set of procedures which can be dynamically adapted to the concrete case in a company. In daily practice, various methods (e.g. for problem identification and idea creation) can be applied, depending on the company's needs. It is based on the proactive involvement of the company's employees, supported by the appropriate methodology and external experts. The phases are performed via a mixture of face-to-face activities (workshops) and online (eLearning) activities taking place in a Moodle eLearning environment and using other e-communication channels. One part of the outcomes is an identified set of opportunities and concrete solutions ready for implementation. The other, also very important, result concerns the innovation competences the participating employees gain in concrete tools and methods for idea management. In addition, the employees gain strong experience in dynamic, efficient and solution-oriented management of the invention process. The eMIPS also represents a way of establishing or improving the innovation culture in an organization. A pilot application in one company showed excellent results regarding both the motivation of participants and the outcomes achieved.

Keywords: Creativity, distance learning, front end, innovation, problem.

165 Assessing the Sheltering Response in the Middle East: Studying Syrian Camps in Jordan

Authors: Lara A. Alshawawreh, R. Sean Smith, John B. Wood

Abstract:

This study focuses on the sheltering response in the Middle East, specifically by reviewing two Syrian refugee camps in Jordan: Zaatari and Azraq. Zaatari camp involved the rapid deployment of tents and shelters over a very short period of time, whereas Azraq was purpose-built and pre-planned over a longer period. At present, both camps collectively host more than 133,000 occupants. Field visits were made to both camps, and the main issues and problems in the sheltering response were highlighted through focus group discussions with camp occupants and inspection of shelter habitats. This provided both subjective and objective research data sources. While every case has its own significance and deployment to meet humanitarian needs, there are some common requirements irrespective of geographical region. The results suggest that there is a gap between the required habitat needs and what has been provided. It is recommended that the global international response and support could be improved in relation to habitat form, construction type, layout, function and, critically, cultural aspects. Services, health and hygiene are key elements of the shelter habitat provision. The study also identified the amendments to shelters undertaken by the beneficiaries, providing insight into their key requirements. The outcomes from this study could provide an important learning opportunity for developing improved habitat responses for future shelters.

Keywords: Culture, post-disaster, refugees, shelters.

164 Synthesis and Characterization of ZnO and Fe3O4 Nanocrystals from Oleate-based Organometallic Compounds

Authors: PoiSim Khiew, WeeSiong Chiu, ThianKhoon Tan, Shahidan Radiman, Roslan Abd-Shukor, Muhammad Azmi Abd-Hamid, ChinHua Chia

Abstract:

Magnetic and semiconductor nanomaterials exhibit novel magnetic and optical properties owing to their unique size- and shape-dependent effects. As the size shrinks down to the nanoscale region, various anomalous properties that are not normally present in the bulk start to dominate. The ability to harness these anomalous properties for the design of various advanced electronic devices depends strictly on the synthetic strategy. Hence, current research has focused on developing rational synthetic control to produce high-quality nanocrystals by using an organometallic approach to tune both the size and the shape of the nanomaterials. In order to elucidate the growth mechanism, transmission electron microscopy was employed as a powerful tool for performing time-resolved morphological and structural characterization of magnetic (Fe3O4) and semiconductor (ZnO) nanocrystals. The current synthetic approach is able to produce nanostructures with well-defined shapes. We have found that oleic acid is an effective capping ligand for preparing oxide-based nanostructures without any agglomeration, even at high temperature. The oleate-based precursors and capping ligands are fatty acid compounds originating from natural palm oil, with low toxicity. In comparison with other synthetic approaches, the current method offers an effective route to oxide-based nanomaterials with well-defined shapes and good monodispersity. The nanocrystals are well separated from each other, without any stacking effect. In addition, the as-synthesized nanopellets are chemically and physically stable compared to previously reported nanomaterials. Further development and extension of the current synthetic strategy are being pursued to combine both of these materials into a nanocomposite that will be used as a "smart magnetic nanophotocatalyst" for industrial wastewater treatment.

Keywords: Metal oxide nanomaterials, nanophotocatalyst, organometallic synthesis, morphology control.

163 Bridge Health Monitoring: A Review

Authors: Mohammad Bakhshandeh

Abstract:

Structural Health Monitoring (SHM) is a crucial and necessary practice that plays a vital role in ensuring the safety and integrity of critical structures, and in particular, bridges. The continuous monitoring of bridges for signs of damage or degradation through Bridge Health Monitoring (BHM) enables early detection of potential problems, allowing for prompt corrective action to be taken before significant damage occurs. Although all monitoring techniques aim to provide accurate and decisive information regarding the remaining useful life, safety, integrity, and serviceability of bridges, understanding the development and propagation of damage is vital for maintaining uninterrupted bridge operation. Over the years, extensive research has been conducted on BHM methods, and experts in the field have increasingly adopted new methodologies. In this article, we provide a comprehensive exploration of the various BHM approaches, including sensor-based, non-destructive testing (NDT), model-based, and artificial intelligence (AI)-based methods. We also discuss the challenges associated with BHM, including sensor placement and data acquisition, data analysis and interpretation, cost and complexity, and environmental effects, through an extensive review of relevant literature and research studies. Additionally, we examine potential solutions to these challenges and propose future research ideas to address critical gaps in BHM.

Keywords: Structural health monitoring, bridge health monitoring, sensor-based methods, machine-learning algorithms, model-based techniques, sensor placement, data acquisition, data analysis.

162 The Use of Artificial Intelligence in Digital Forensics and Incident Response in a Constrained Environment

Authors: Dipo Dunsin, Mohamed C. Ghanem, Karim Ouazzane

Abstract:

Digital investigators often have a hard time spotting evidence in digital information. It has become hard to determine which source of proof relates to a specific investigation. A growing concern is that the various processes, technologies, and specific procedures used in digital investigation are not keeping up with criminal developments, and criminals are taking advantage of these weaknesses to commit further crimes. In digital forensics investigations, artificial intelligence (AI) is invaluable in identifying crime. Providing objective data and conducting an assessment is the goal of digital forensics and digital investigation, which will assist in developing a plausible theory that can be presented as evidence in court. This research paper aims to develop a multiagent framework for digital investigations using specific intelligent software agents (ISAs). The agents communicate to address particular tasks jointly and keep the same objectives in mind during each task. The rules and knowledge contained within each agent depend on the investigation type. A criminal investigation is classified quickly and efficiently using the case-based reasoning (CBR) technique. The proposed framework is implemented using the Java Agent Development Framework, Eclipse, a Postgres repository, and a rule engine for agent reasoning. The framework was tested using the Lone Wolf image files and datasets. Experiments were conducted using various sets of ISAs and VMs. There was a significant reduction in the time taken for the Hash Set Agent to execute. As a result of loading the agents, 5% of the time was lost, as the File Path Agent prescribed deleting 1,510, while the Timeline Agent found multiple executable files. In comparison, the integrity check carried out on the Lone Wolf image file using a digital forensic toolkit took approximately 48 minutes (2,880 s), whereas the MADIK framework accomplished this in 16 minutes (960 s). The framework is integrated with Python, allowing for further integration of other digital forensic tools, such as AccessData Forensic Toolkit (FTK), Wireshark, Volatility, and Scapy.
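
As a purely illustrative stand-in for the kind of task one ISA performs, a toy "hash set agent" that flags evidence files matching known digests might look like the following (function name and digests are hypothetical, not taken from the paper):

```python
# Illustrative stand-in for a Hash Set Agent: hash every file under an
# evidence mount and flag matches against a known-file digest set.
import hashlib
from pathlib import Path

KNOWN_BAD = {"5d41402abc4b2a76b9719d911017c592"}  # example MD5 digests

def hash_set_agent(evidence_root):
    hits = []
    for f in Path(evidence_root).rglob("*"):
        if f.is_file():
            digest = hashlib.md5(f.read_bytes()).hexdigest()
            if digest in KNOWN_BAD:
                hits.append((str(f), digest))
    return hits
```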

Keywords: Artificial intelligence, computer science, criminal investigation, digital forensics.

161 Stochastic Simulation of Reaction-Diffusion Systems

Authors: Paola Lecca, Lorenzo Dematte

Abstract:

Reaction-diffusion systems are mathematical models that describe how the concentration of one or more substances distributed in space changes under the influence of local chemical reactions, in which the substances are converted into each other, and diffusion, which causes the substances to spread out in space. The classical representation of a reaction-diffusion system is given by semi-linear parabolic partial differential equations, whose general form is ∂X(x, t)/∂t = DΔX(x, t), where X(x, t) is the state vector, D is the matrix of the diffusion coefficients and Δ is the Laplace operator. If the solutes move in a homogeneous system in thermal equilibrium, the diffusion coefficients are constants that depend neither on the local concentrations of solvent and solutes nor on the local temperature of the medium. In this paper a new stochastic reaction-diffusion model is presented in which the diffusion coefficients are functions of the local concentration, viscosity and frictional forces of solvent and solute. Such a model provides a more realistic description of molecular kinetics in non-homogeneous and highly structured media, such as the intra- and inter-cellular spaces. The movement of a molecule A from a region i to a region j of the space is described as a first-order reaction Ai → Aj with rate constant k, where k depends on the diffusion coefficient. Representing diffusional motion as a chemical reaction allows a reaction-diffusion system to be assimilated to a pure reaction system and simulated with Gillespie-inspired stochastic simulation algorithms. The stochastic time evolution of the system is given by the occurrence of diffusion events and chemical reaction events. At each time step an event (reaction or diffusion) is selected from a probability distribution of waiting times determined by the specific speeds of the reaction and diffusion events. Redi is the software tool developed to implement this model of reaction-diffusion kinetics and dynamics. It is free software that can be downloaded from http://www.cosbi.eu. To demonstrate the validity of the new reaction-diffusion model, simulation results for chaperone-assisted protein folding in cytoplasm obtained with Redi are reported. This case study is drawing renewed attention from the scientific community due to current interest in protein aggregation as a potential cause of neurodegenerative diseases.
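
A minimal sketch of the core idea, with diffusion between two subvolumes treated as the first-order jump reactions A0 → A1 and A1 → A0 inside a Gillespie loop (rates are illustrative, not Redi's actual parameters):

```python
# Gillespie-style SSA in which diffusion between two subvolumes is the
# first-order "reaction" A_i -> A_j, as in the model described above.
import numpy as np

rng = np.random.default_rng(0)
A = np.array([100, 0])         # copies of A in subvolumes 0 and 1
k_diff = 1.0                   # diffusion "rate constant", roughly D / h^2
t, t_end = 0.0, 5.0

while t < t_end:
    props = np.array([k_diff * A[0], k_diff * A[1]])  # 0->1 and 1->0 jumps
    total = props.sum()
    if total == 0:
        break
    t += rng.exponential(1.0 / total)   # waiting time to the next event
    if rng.random() < props[0] / total: # choose which jump fires
        A += (-1, 1)
    else:
        A += (1, -1)
print(t, A)
```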

Keywords: Reaction-diffusion systems, Fick's law, stochastic simulation algorithm.

160 Unpacking Chilean Preservice Teachers’ Beliefs on Practicum Experiences through Digital Stories

Authors: Claudio Díaz, Mabel Ortiz

Abstract:

An EFL teacher education programme in Chile takes five years to train a future teacher of English. Preservice teachers are prepared to learn an advanced level of English and to teach the language from 5th to 12th grade in the Chilean educational system. In the context of their first EFL Methodology course in year four, preservice teachers have to create a five-minute digital story that starts from a critical incident they have experienced as teachers-to-be during their observations or interventions in schools. A critical incident can be defined as a happening, a specific incident or event either observed by them or involving them; the happening sparks their thinking and may make them subsequently think differently about the particular event. When they create their digital stories, preservice teachers put technology, teaching practice and theory together to narrate a story that is complemented by still images, moving images, text, sound effects and music. The story should be told as a personal narrative which explains the critical incident. This presentation will focus on the creation process of 50 Chilean preservice teachers' digital stories, highlighting the critical incidents from which they started their stories. It will also unpack preservice teachers' beliefs and reflections when approaching their teaching practices in schools. These beliefs will be coded and categorized through content analysis to evidence preservice teachers' most deeply rooted conceptions about English teaching and learning in Chilean schools. The findings seem to indicate that preservice teachers' beliefs are strongly mediated by contextual and affective factors.

Keywords: Beliefs, Digital stories, Preservice teachers, Practicum.

159 Tracing Syrian Refugees Urban Mobilities: The Case of Egypt and Canada

Authors: N. Elgendy, N. Hussein

Abstract:

The current Syrian crisis has caused unprecedented practices of global mobility. The process of forced eviction and the resettlement of refugees can be seen through the insights of the "new mobilities paradigm". The mobility of refugees, in terms of meaning and practice, is a subject that calls for further study, and there is a need to develop an approach to human mobility that can account for a practice that is turning into a phenomenon of the 21st century. This paper aims to study, from a qualitative point of view, the process of movement within the six constituents of mobility, defined as the first phase of a refugee's journey. The second phase includes the process of settling in and re-defining the host country as the refugees' new "home". The change in the refugee's state of mind, and the crossing of physical and mental borders from "foreigner" to citizen, is encouraged both by governmental policies and by local communities' efforts to embrace the newcomers. The paper focuses on these policies of social and economic integration. The concept of integration connotes the idea that refugees should enjoy the opportunities, rights and services available to the citizens of their new community. The paper examines this concept by showcasing two hosting countries, Canada and Egypt, which provide contrasting situations in terms of cultural, geographical, economic and political backgrounds. The analysis highlights the specific policies defined towards the refugees, including mass communication, media campaigns, and access to employment. This research is part of a qualitative research project on the process of urban mobility practiced by Syrian refugees, drawing on conversational interviews with new-settlers who have moved from their homes in Syria to the different hosting countries. It explores these immigrants' practical and emotional relationships with the process of movement and settlement, using the conversational interviews as a tool for document analysis and for drawing relationships, in an attempt to establish an understanding of the factors that contribute to the new-settlers' feeling of home and integration within the new community.

Keywords: Mobility, refugees, home, integration.

158 Rank-Based Chain-Mode Ensemble for Binary Classification

Authors: Chongya Song, Kang Yen, Alexander Pons, Jin Liu

Abstract:

In the field of machine learning, ensembles are a common methodology for improving performance over multiple base classifiers. However, true predictions are often canceled out by false ones during consensus due to a phenomenon called the "curse of correlation", which manifests as strong interference among the predictions produced by the base classifiers. In addition, existing practices still cannot effectively mitigate the problem of imbalanced classification. Based on the analysis of our experimental results, we conclude that both problems are caused by inherent deficiencies in the consensus approach. We therefore create an enhanced ensemble algorithm which adopts a designed rank-based chain-mode consensus to overcome the two problems. To evaluate the proposed ensemble algorithm, we employ the well-known benchmark dataset NSL-KDD (the University of New Brunswick's improved version of KDDCup99) to compare the proposed algorithm against 8 common ensemble algorithms. In particular, each compared ensemble classifier uses the same 22 base classifiers, so that the differences in the improvements in accuracy and reliability over the base classifiers can be truly revealed. As a result, the proposed rank-based chain-mode consensus proves to be a more effective ensemble solution than the traditional consensus approach, outperforming the 8 ensemble algorithms by 20% on almost all compared metrics, which include accuracy, precision, recall, F1-score and area under the receiver operating characteristic curve.
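
The abstract does not detail the chain-mode rule, so the following is only a hedged illustration of one rank-ordered chain: classifiers ranked by validation accuracy are queried in turn, and the first sufficiently confident one decides (assumes scikit-learn-style classifiers with integer classes 0..C-1; the paper's actual consensus may differ):

```python
# Hedged sketch of a rank-ordered "chain" consensus over base classifiers.
import numpy as np

def chain_predict(ranked_clfs, X, threshold=0.8):
    n = X.shape[0]
    preds = np.full(n, -1)
    pending = np.arange(n)                     # samples still undecided
    for clf in ranked_clfs:                    # best-ranked classifier first
        if pending.size == 0:
            break
        proba = clf.predict_proba(X[pending])
        confident = proba.max(axis=1) >= threshold
        preds[pending[confident]] = proba[confident].argmax(axis=1)
        pending = pending[~confident]
    if pending.size:                           # fallback: last classifier
        preds[pending] = ranked_clfs[-1].predict(X[pending])
    return preds
```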

Keywords: Consensus, curse of correlation, imbalanced classification, rank-based chain-mode ensemble.

157 Cross Signal Identification for PSG Applications

Authors: Carmen Grigoraş, Victor Grigoraş, Daniela Boişteanu

Abstract:

The standard investigational method for diagnosing obstructive sleep apnea syndrome (OSAS) is polysomnography (PSG), which consists of a simultaneous, usually overnight recording of multiple electro-physiological signals related to sleep and wakefulness. This is an expensive, cumbersome and not readily repeated protocol, so there is a need for simpler and more easily implemented screening and detection techniques. Identification of apnea/hypopnea events in the screening recordings is the key factor for the diagnosis of OSAS. The analysis of a single-lead electrocardiographic (ECG) signal for OSAS diagnosis, which may be done with portable devices at the patient's home, has been the challenge of recent years. A novel artificial neural network (ANN) based approach for feature extraction and automatic identification of respiratory events in ECG signals is presented in this paper. A nonlinear principal component analysis (NLPCA) method was considered for feature extraction, and a support vector machine for classification/recognition. An alternative representation of the respiratory events by means of a Kohonen-type neural network is discussed. Our prospective study was based on male and female OSAS patients of the Clinical Hospital of Pneumology in Iaşi, Romania, as well as on non-OSAS investigated human subjects. Our computational analysis includes a learning phase based on cross-signal PSG annotation.
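
As a rough stand-in for the described pipeline, with linear PCA substituted for the paper's nonlinear PCA and random placeholder data in place of real ECG segments:

```python
# Sketch only: feature extraction + SVM classification of respiratory
# events. Linear PCA stands in for the paper's NLPCA; the data below are
# random placeholders with assumed shapes, not real ECG recordings.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.pipeline import make_pipeline
from sklearn.svm import SVC

X = np.random.randn(200, 512)        # (segments, samples per ECG window)
y = np.random.randint(0, 2, 200)     # 1 = apnea/hypopnea event, 0 = normal

model = make_pipeline(PCA(n_components=10), SVC(kernel="rbf"))
model.fit(X, y)
print(model.score(X, y))             # training accuracy on the toy data
```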

Keywords: Artificial neural networks, feature extraction, obstructive sleep apnea syndrome, pattern recognition, signal processing.

156 Gammarus:Asellus Ratio as an Index of Organic Pollution – (A Case Study in Markeaton, Kedleston Hall, and Allestree Park Lakes Derby) UK

Authors: U. Bawa

Abstract:

Macroinvertebrates have been used to monitor organic pollution in rivers and streams, and several biotic indices based on macroinvertebrates have been developed over the years, including the Biological Monitoring Working Party (BMWP) score. A new biotic index, the Gammarus:Asellus ratio, has recently been proposed as an index of organic pollution. This study tested the validity of the Gammarus:Asellus ratio as an index of organic pollution by examining the relationship between the ratio and physico-chemical parameters, as well as other biotic indices such as the BMWP and the Average Score Per Taxon (ASPT), in lakes and streams at Markeaton Park, Allestree Park and Kedleston Hall, Derbyshire. Macroinvertebrates were sampled using the standard five-minute kick sampling technique, and physical and chemical environmental variables were obtained using standard sampling techniques. Eighteen sites were sampled: six from Markeaton Park (three across the stream and three across the lake), and six each from the Allestree Park and Kedleston Hall lakes. Contrary to expectation, the Gammarus:Asellus ratio showed significant positive correlations with parameters indicative of organic pollution, such as nitrate, phosphate and calcium levels, and significant negative correlations with the other biotic indices (BMWP/ASPT). The BMWP score correlated significantly and positively with some water quality parameters, such as dissolved oxygen and flow rate, but showed no correlations with the other chemical environmental variables. The BMWP score was significantly higher in the stream than in the lake at Markeaton Park, and the ASPT scores were significantly higher in the upper lakes than in the middle and lower lakes. This study has further strengthened the use of the BMWP/ASPT scores as indices of organic pollution, but additional application is required to validate the Gammarus:Asellus ratio as a rapid biomonitoring tool.

Keywords: Asellus, biotic index, Gammarus, organic pollution, macroinvertebrate.

155 Improving Topic Quality of Scripts by Using Scene Similarity Based Word Co-Occurrence

Authors: Yunseok Noh, Chang-Uk Kwak, Sun-Joong Kim, Seong-Bae Park

Abstract:

Scripts are one of the basic text resources for understanding broadcasting contents, and topic modeling is a method for summarizing broadcasting contents from their scripts. Generally, scripts represent contents descriptively through directions and speeches, and provide scene segments that can be seen as semantic units. Therefore, a script can be topic modeled by treating each scene segment as a document. Because scene segments consist mainly of speeches, however, relatively few word co-occurrences are observed within them, which inevitably leads to poor topic quality under statistical learning methods. To tackle this problem, we propose a method to improve topic quality with additional word co-occurrence information obtained from scene similarities. The main idea is that knowing that two or more texts are topically related is useful for learning high-quality topics, while more accurate topical representations in turn yield more accurate information about whether two texts are related. In this paper, we regard two scene segments as related if their topical similarity is high enough, and we consider words to co-occur if they appear together in topically related scene segments. By iteratively inferring topics and determining semantically neighboring scene segments, we draw a topic space that represents broadcasting contents well. In the experiments, we show that the proposed method generates higher-quality topics from Korean drama scripts than the baselines.

Keywords: Broadcasting contents, generalized Pólya urn model, scripts, text similarity, topic model.

154 Effective Internal Control System in the Nasarawa State Tertiary Educational Institutions for Efficiency: A Case of Nasarawa State Polytechnic, Lafia

Authors: Ibrahim Dauda Adagye

Abstract:

An effective internal control system in the bursary unit of tertiary educational institutions is geared toward achieving a quality teaching, learning and research environment, as well as assisting the management of the institutions, particularly when decisions are to be made. While internal control systems exist in all institutions, the objectives outlined above are far from being achieved. This paper therefore assesses the effectiveness of the internal control system in tertiary educational institutions in Nasarawa State, Nigeria, with specific focus on the Nasarawa State Polytechnic, Lafia. The study is a survey; a simple closed-ended questionnaire was developed and administered to a sample of twenty-seven (27) staff members from the Bursary and the Internal Audit unit of the Nasarawa State Polytechnic, Lafia, so as to obtain data for analysis and to test the study hypothesis. Responses to the questionnaire were analysed using simple percentages and the chi-square test. Findings show that the right people are not assigned to the right jobs in the department, budgeting and management accounting were never used in the institution's operations, and checking of subordinates by their superior officers is not regular. This renders the current internal control structure of the Polytechnic ineffective and weak. The paper therefore recommends that transparency be seen as significant as the institution works toward meeting its objectives: the right staff should be assigned to the right jobs, and regular checking of subordinates by their superiors should be ensured.

Keywords: Bursary unit, efficiency, internal control, tertiary educational institutions.

153 Influential Parameters in Estimating Soil Properties from Cone Penetrating Test: An Artificial Neural Network Study

Authors: Ahmed G. Mahgoub, Dahlia H. Hafez, Mostafa A. Abu Kiefa

Abstract:

The Cone Penetration Test (CPT) is a common in-situ test which generally investigates a much greater volume of soil, more quickly, than is possible with sampling and laboratory tests. It therefore has the potential to realize cost savings and to assess soil properties rapidly and continuously. The principal objective of this paper is to demonstrate the feasibility and efficiency of using artificial neural networks (ANNs) to predict the soil angle of internal friction (Φ) and the soil modulus of elasticity (E) from CPT results, considering the uncertainties and non-linearities of the soil. In addition, ANNs are used to study the influence of different parameters and to recommend which should be included as inputs to improve the prediction. Neural networks discover relationships in the input data sets through the iterative presentation of the data and the intrinsic mapping characteristics of neural topologies. The General Regression Neural Network (GRNN) is one of the powerful neural network architectures utilized in this study. A large amount of field and experimental data, including CPT results, plate load tests, direct shear box tests, grain size distributions and calculated overburden pressures, was obtained from a large project in the United Arab Emirates and used for the training and validation of the neural network. A comparison was made between the ANN results, some common traditional correlations that predict Φ and E from CPT results, and the actual measured data. The results show that the ANN is a very powerful tool: very good agreement was obtained between the ANN estimates and the actual measurements, in comparison with the other correlations available in the literature. The study recommends some easily available parameters that should be included in the estimation of the soil properties to improve the prediction models. It is shown that the use of the friction ratio in the estimation of Φ, and of the fines content in the estimation of E, considerably improves the prediction models.
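
A GRNN is essentially Nadaraya-Watson kernel regression over the training patterns; a compact version predicting Φ from a few CPT-derived features (all numbers are made up for illustration, not the study's UAE data):

```python
# Compact GRNN (Nadaraya-Watson kernel regression): each prediction is a
# Gaussian-weighted average of training targets.
import numpy as np

def grnn_predict(X_train, y_train, X_query, sigma=0.5):
    # squared distance of each query pattern to each training pattern
    d2 = ((X_query[:, None, :] - X_train[None, :, :]) ** 2).sum(axis=2)
    w = np.exp(-d2 / (2.0 * sigma**2))       # Gaussian kernel weights
    return (w @ y_train) / w.sum(axis=1)     # weighted average of targets

# columns: cone resistance, sleeve friction, friction ratio (illustrative)
X_train = np.array([[5.0, 0.05, 1.0], [12.0, 0.08, 0.7], [20.0, 0.10, 0.5]])
y_train = np.array([28.0, 33.0, 38.0])       # phi in degrees (illustrative)
print(grnn_predict(X_train, y_train, np.array([[10.0, 0.07, 0.8]])))
```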

Keywords: Angle of internal friction, Cone penetrating test, General regression neural network, Soil modulus of elasticity.

152 Simulation and Parameterization by the Finite Element Method of a C-Shaped Electromagnet for Application in the Characterization of Magnetic Properties of Materials

Authors: A. A. Velásquez, J. Baena

Abstract:

This article presents the simulation, parameterization and optimization of an electromagnet with the C-shaped configuration, intended for the study of magnetic properties of materials. The electromagnet studied consists of a C-shaped yoke, which provides self-shielding to minimize losses of magnetic flux density, two poles of high magnetic permeability, and power coils wound on the poles. The main physical variable studied was the static magnetic flux density in a column within the gap between the poles, with a 4 cm² square cross section and a length of 5 cm, seeking a suitable set of parameters that achieves a uniform magnetic flux density of 1×10⁴ Gauss or above in the column when the system operates at room temperature with a current consumption not exceeding 5 A. By means of a magnetostatic analysis with the finite element method, the magnetic flux density and the distribution of the magnetic field lines were visualized and quantified. From the results obtained by simulating an initial configuration, a structural optimization of the geometry of the adjustable caps for the ends of the poles was performed. The effect of the magnetic permeability of the soft magnetic materials used in the pole system, such as low-carbon steel (0.08% C), Permalloy (45% Ni, 54.7% Fe) and Mumetal (21.2% Fe, 78.5% Ni), was also evaluated. The intensity and uniformity of the magnetic field in the gap showed a high dependence on the factors described above. The magnetic field achieved in the column was uniform, and its magnitude ranged between 1.5×10⁴ Gauss and 1.9×10⁴ Gauss according to the pole material used, with the possibility of increasing the magnetic field by choosing a suitable cap geometry, introducing a cooling system for the coils and adjusting the spacing between the poles. This makes the device a versatile and scalable tool for generating the magnetic field necessary to perform magnetic characterization of materials by techniques such as vibrating sample magnetometry (VSM), Hall effect and Kerr effect magnetometry, among others. Additionally, a CAD design of the modules of the electromagnet is presented in order to facilitate the construction and scaling of the physical device.
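
As a back-of-envelope check that the 1×10⁴ Gauss (1 T) target is plausible within the 5 A budget, the gap field of a C-core magnetic circuit is roughly B = μ₀NI / (g + L/μᵣ); the values below are illustrative, not the paper's design parameters:

```python
# Rough magnetic-circuit estimate of the gap field of a C-core
# electromagnet (turn count, gap and core path length are assumptions).
from math import pi

mu0 = 4e-7 * pi
N, I = 2000, 5.0               # total turns, amperes (within the 5 A budget)
g, L, mu_r = 0.01, 0.5, 4000   # gap 1 cm, core path 0.5 m, soft-iron mu_r

B = mu0 * N * I / (g + L / mu_r)
print(f"B ~ {B:.2f} T = {B * 1e4:.0f} Gauss")   # ~1.2 T, near the target
```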

Keywords: Electromagnet, Finite Elements Method, Magnetostatic, Magnetometry, Modeling.

151 Information Filtering using Index Word Selection based on the Topics

Authors: Takeru YOKOI, Hidekazu YANAGIMOTO, Sigeru OMATU

Abstract:

We have proposed an information filtering system using index word selection from a document set based on the topics included in the set. This method narrows the vocabulary down to the particularly characteristic words in a document set, with the topics obtained by sparse non-negative matrix factorization. In information filtering, a document is often represented as a vector whose elements correspond to the weights of the index words, and the dimension of the vector grows as the number of documents increases; it is therefore possible that words useless as index words for information filtering are included. To address this problem, the dimension needs to be reduced. Our proposal reduces the dimension by selecting index words based on the topics included in a document set, obtained by applying sparse non-negative matrix factorization to the document set. The filtering is carried out based on a centroid of the learning document set, which is regarded as the user's interest; the centroid is represented as a document vector whose elements consist of the weights of the selected index words. Using the English test collection MEDLINE, we confirm the effectiveness of our proposal: it improves recommendation accuracy over previous methods when an appropriate number of index words is selected. In addition, we examined the selected index words and found that our proposal is able to select index words covering some minor topics included in the document set.
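
A sketch of the selection idea using scikit-learn's L1-regularized NMF as the sparse factorization (toy corpus; the paper's own sparse NMF formulation may differ):

```python
# Sketch: factorize the term-document matrix with L1-penalized NMF and
# keep the top-weighted words of each topic as index words.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.decomposition import NMF

docs = ["gene expression in tumor cells", "protein folding dynamics",
        "tumor suppressor gene mutation", "molecular dynamics of proteins"]

vec = TfidfVectorizer(stop_words="english")
X = vec.fit_transform(docs)
nmf = NMF(n_components=2, init="nndsvd", l1_ratio=1.0, alpha_H=0.01,
          random_state=0).fit(X)

terms = vec.get_feature_names_out()
for k, topic in enumerate(nmf.components_):
    index_words = [terms[i] for i in topic.argsort()[::-1][:3]]
    print(f"topic {k}: {index_words}")
```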

Keywords: Information filtering, sparse NMF, index word selection, user profile, chi-squared measure.

150 Being a Lay Partner in Jesuit Higher Education in the Philippines: A Grounded Theory Application

Authors: Janet B. Badong-Badilla

Abstract:

In Jesuit universities, laypersons, who come from the same or different faith backgrounds or traditions, are considered collaborators in mission. The Jesuits themselves support the contributions of lay partners in realizing the mission of the Society of Jesus and recognize the important role they play in education. This study aims to investigate and generate particular notions and understandings of the lived experience of being a lay partner in Jesuit universities in the Philippines, particularly among those involved in higher education. Using the qualitative approach introduced by grounded theorist Barney Glaser, the lay partners' concept of being a partner, as lived in higher education, is generated systematically from data collected in the field, primarily through in-depth interviews, field notes and observations. Glaser's constant comparative method of data analysis is used, going through the phases of open coding, theoretical coding and selective coding, from memoing to theoretical sampling to sorting and then writing. In this study, Glaser's grounded theory as a methodology provides substantial insight into, and articulation of, the layperson's actual experience of being a partner of the Jesuits in education. Such articulation provides a phenomenological framework for understanding the meaning and core characteristics of Jesuit-lay partnership in Jesuit institutions of higher learning in the country. The study is expected to provide a framework or model for lay partnership in academic institutions that have the same practice of having lay partners in mission.

Keywords: Grounded theory, Jesuit mission in higher education, lay partner, lived experience.

149 Identifying Teachers’ Perception of Integrity in School-Based Assessment Practice: A Case Study

Authors: Abd Aziz Bin Abd Shukor, Eftah Binti Moh Hj Abdullah

Abstract:

This case study aims to identify teachers' perceptions of integrity in School-Based Assessment (PBS) practice. This descriptive study involved 9 teachers from 4 secondary schools in 3 districts in the state of Perak. The respondents were interviewed on integrity in PBS practice using a focus group discussion method. The overall findings showed that the teachers believed integrity in PBS practice could be achieved by adjusting teaching methods to align with the learning objectives and the students' characteristics. Many teachers, parents and students did not understand the best practice of PBS, which affects the integrity of PBS practice. Teachers did not emphasise principles and ethics, and their integrity as innovative public servants may also be affected by the frequently changing assessment system, lack of training, and absence of prior action research. The analysis showed that the teachers viewed organizational integrity involving PBS as difficult to implement based on the expectations set by the Malaysian Ministry of Education (KPM). A few elements assisting the achievement of PBS integrity were training, students' understanding, parents' understanding of PBS, and the environment (involving human resources, such as support and appreciation, and non-human resources, such as technology infrastructure readiness and media). The implications of this study show that teachers, as the PBS implementers, have a strong influence on the integrity of PBS. However, the transformation of behaviour involving PBS integrity among teachers requires stable support and infrastructure to enable teachers to implement PBS in an ethical manner.

Keywords: Assessment integrity, integrity, perception, school-based assessment.

148 Teaching Ethical Behaviour: Conversational Analysis in Perspective

Authors: Nikhil Kewal Krishna Mehta

Abstract:

In the past, researchers have questioned the effectiveness of ethics training in higher education. Observations also support the view that the models of ethical behaviour (a range of actions) and ethical decision making used in the past rely on vignettes to explain ethical behaviour, and the understanding remains that such vignettes play a limited role: they capture individual intentions rather than actions. Some authors have likewise agreed that there can be differences between one's intentions and one's actions. This paper attempts to fill that gap by evaluating real actions rather than intentions. It suggests an experiential methodology to explore Berlo's model of communication as an action, along with the orchestration of various principles. To this end, an attempt was made to use conversational analysis to evaluate ethical decision-making behaviour among students and middle-level managers. The process was repeated six times with an average of 15 participants per set. Similarities were observed in the behaviour of students and middle-level managers, suggesting that both groups have no cognizance of their actual actions. The deliberations derived from the conversations were then taken a step further into meta-ethical evaluation to portray a clear picture of ethical behaviour among participants. This study provides insights for understanding demonstrated unconscious human behaviour, which may fortuitously be termed both ethical and unethical.

Keywords: Berlo's action model of communication, conversational analysis, ethical behaviour, ethical decision making, experiential learning, intentions and actions.

147 Monetary Evaluation of Dispatching Decisions in Consideration of Mode Choice Models

Authors: Marcel Schneider, Nils Nießen

Abstract:

Microscopic simulation toolkits allow both the railway operations process and the preceding timetable production to be considered. Block occupation conflicts on both process levels are often solved by using defined train priorities. These conflict resolutions (dispatching decisions) generate reactionary delays for the trains involved. The sum of reactionary delays is commonly used to evaluate the quality of railway operations, describing the timetable robustness; it is either compared to an acceptable level of train performance, or the delays are appraised economically by linear monetary functions. Yet it is impossible to adequately evaluate dispatching decisions without a well-founded objective function. This paper presents a new approach for the evaluation of dispatching decisions which uses mode choice models and considers the behaviour of the end customers. These models evaluate the reactionary delays in more detail and account for competing modes of transport. The new approach couples a microscopic model of railway operations with a macroscopic mode choice model. At first it will be implemented for the railway operations process, but it can also be used for timetable production. The evaluation considers the customer's option of switching to other transport modes; the approach starts with rail and road, but can be extended to air travel. The result of a mode choice model is the modal split, and the reactions of the end customers have an impact on the revenue of the train operating companies. Different purposes of travel have different payment reserves and tolerances towards late running. Aside from changes to revenues, longer journey times can also generate additional costs, which are either time- or track-specific and arise from required changes to rolling stock or train crew cycles. Only the variable values are summarised in the contribution margin, which is the base for the monetary evaluation of delays. The contribution margin is calculated for the different possible solutions to the same conflict, and the conflict resolution is optimised until the monetary loss becomes minimal. The iterative process thus determines an optimum conflict resolution by monitoring the change in the contribution margin; furthermore, a monetary value of each dispatching decision can be derived.
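
A toy version of the coupling: a binary logit splits demand between rail and road as rail delay grows, and the resulting revenue enters the contribution margin of a dispatching alternative (utility coefficients, demand and prices are invented for illustration):

```python
# Sketch: delay -> logit mode choice -> revenue -> contribution margin.
import math

def rail_share(delay_min, beta_time=-0.03, asc_rail=0.5):
    u_rail = asc_rail + beta_time * delay_min   # rail utility falls with delay
    u_road = 0.0                                # road as the reference mode
    return math.exp(u_rail) / (math.exp(u_rail) + math.exp(u_road))

def contribution_margin(delay_min, demand=500, fare=20.0, var_cost=4000.0):
    revenue = demand * rail_share(delay_min) * fare
    return revenue - var_cost                   # variable costs only

for delay in (0, 10, 30):                       # compare conflict resolutions
    print(delay, round(contribution_margin(delay), 1))
```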

Keywords: Choice of mode, monetary evaluation, railway operations, reactionary delays.

146 The Effect of Motor Learning Based Computer-Assisted Practice for Children with Handwriting Deficit – Comparing with the Effect of Traditional Sensorimotor Approach

Authors: Shao-Hsia Chang, Nan-Ying Yu

Abstract:

The objective of this study was to test how advanced digital technology enables more effective handwriting training for children with handwriting deficits. The study implemented graphomotor apparatuses in a computer-assisted instruction system, and the intervention effect was verified in a randomized controlled trial. Forty-two children with handwriting deficits were assigned to a computer-assisted instruction group, a sensorimotor training group, or a control (no intervention) group. Handwriting performance was measured using the Elementary reading/writing test and a computerized handwriting evaluation before and after 6 weeks of intervention. Analyses of variance of change scores were conducted to test for statistically significant differences across the three groups. Significant differences were found among the three groups, with the computer group differing significantly from the other two. Significant improvements were observed in near-point copy, far-point copy, the dictation test, and writing from phonetic symbols. Writing speed and mean stroke velocity in near-point, far-point and short-paragraph copy also differed significantly among the three groups, with the computer group showing significant improvement over the others. For clinicians and school teachers, the results of this study provide motor-control-based insight into the improvement of handwriting difficulties.

Keywords: Dysgraphia, computerized handwriting evaluation, sensorimotor program, computer assisted program.

145 Context Detection in Spreadsheets Based on Automatically Inferred Table Schema

Authors: Alexander Wachtel, Michael T. Franzen, Walter F. Tichy

Abstract:

Programming requires years of training. With natural language and end user development methods, programming could become available to everyone: end users could program their own devices and extend the functionality of an existing system without any knowledge of programming languages. In this paper, we describe an Interactive Spreadsheet Processing Module (ISPM), a natural language interface to spreadsheets that allows users to address ranges within a spreadsheet based on an inferred table schema. Using the ISPM, end users are able to search for values in the schema of the table and to address the data in spreadsheets implicitly; it also enables them to select and sort spreadsheet data using natural language. The ISPM uses a machine learning technique to automatically infer areas within a spreadsheet, including different kinds of headers and data ranges, and since ranges can be identified from natural language queries, end users can query the data in natural language. During the evaluation, 12 undergraduate students were asked to perform operations (sum, sort, group and select) using the system and also using Excel without the ISPM interface, and task completion times were compared across the two systems. Only for the selection task did users take less time in Excel (since they selected the cells directly with the mouse) than in the ISPM. The results support using natural language for end user software engineering to overcome the present bottleneck of professional developers.
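
A toy version of the idea, with a crude header heuristic standing in for the paper's machine-learned schema inference and a keyword match standing in for real natural language parsing:

```python
# Toy ISPM-style flow: infer the header row of a sheet, then resolve a
# natural-language-ish query such as "sum Revenue" against that schema.
sheet = [["Region", "Revenue", "Units"],
         ["North", 120.0, 10],
         ["South", 95.5, 8]]

def infer_header(rows):
    # heuristic: an all-text first row above numeric data is the header
    return rows[0] if all(isinstance(c, str) for c in rows[0]) else None

def query(rows, text):
    header = infer_header(rows)
    for j, name in enumerate(header):
        if name.lower() in text.lower():      # match a column by name
            col = [r[j] for r in rows[1:]]
            return sum(col) if text.lower().startswith("sum") else col

print(query(sheet, "sum Revenue"))            # -> 215.5
```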

Keywords: Natural language processing, end user development, natural language interfaces, human computer interaction, data recognition, dialog systems, spreadsheet.

144 Computer Countenanced Diagnosis of Skin Nodule Detection and Histogram Augmentation: Extracting System for Skin Cancer

Authors: S. Zith Dey Babu, S. Kour, S. Verma, C. Verma, V. Pathania, A. Agrawal, V. Chaudhary, A. Manoj Puthur, R. Goyal, A. Pal, T. Danti Dey, A. Kumar, K. Wadhwa, O. Ved

Abstract:

Background: Skin cancer is now a pressing concern in medical science, and the spread of such lesions is drastically affecting the health and well-being of the global village. Methods: The extracted image of the skin tumor cannot be used directly for diagnosis, since the stored image contains irregularities, such as around the lesion center. The approach first locates the relevant region of the extracted skin image, and image partitioning models are presented to sort out the disturbance in the picture. Results: After partitioning is completed, feature extraction is performed using a genetic algorithm (GA), and finally classification is carried out between the trained and test data to evaluate images at scale, which helps doctors make the right prediction. To improve on the existing system, we set our objectives with an analysis; the efficiency of the natural-selection process and of histogram enrichment is essential in that respect. The GA is applied with attention to its accuracy in order to reduce the false-positive rate. Conclusions: The objective of this work is to improve effectiveness, and the GA accomplishes its task in bringing down the false-positive rate. The paper combines deep learning and medical image processing, which provides superior accuracy, and the proportional handling of the stages creates reusability without errors.
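
The abstract gives no GA details, so the following is only a generic sketch of GA-based feature selection (encoding, fitness and rates are illustrative assumptions, not the paper's method):

```python
# Generic GA loop for selecting image features: bit masks are evolved by
# crossover and mutation; accuracy_of is a user-supplied callable that
# scores a mask (e.g., via a classifier trained on the selected features).
import random

def fitness(mask, accuracy_of):
    return accuracy_of(mask) - 0.01 * sum(mask)   # prefer smaller subsets

def ga_select(n_features, accuracy_of, pop=20, gens=30, p_mut=0.05):
    population = [[random.randint(0, 1) for _ in range(n_features)]
                  for _ in range(pop)]
    for _ in range(gens):
        population.sort(key=lambda m: fitness(m, accuracy_of), reverse=True)
        parents = population[:pop // 2]           # elitist selection
        children = []
        while len(children) < pop - len(parents):
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, n_features)
            child = a[:cut] + b[cut:]             # one-point crossover
            child = [1 - g if random.random() < p_mut else g
                     for g in child]              # bit-flip mutation
            children.append(child)
        population = parents + children
    return max(population, key=lambda m: fitness(m, accuracy_of))
```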

Keywords: Computer-aided system, detection, image segmentation, morphology.
