Search results for: learning paradigm
853 AI-Driven Cloud Security: Proactive Defense Against Evolving Cyber Threats
Authors: Ashly Joseph
Abstract:
Cloud computing has become an essential component of enterprises and organizations globally in the current era of digital technology. The cloud offers a multitude of advantages, including scalability, flexibility, and cost-effectiveness, rendering it an appealing choice for data storage and processing. The increasing storage of sensitive information in cloud environments has raised significant concerns over the security of such systems. The frequency of cyber threats and attacks specifically aimed at cloud infrastructure has been increasing, presenting substantial dangers to the data, reputation, and financial stability of enterprises. Conventional security methods can become inadequate when confronted with increasingly intricate and dynamic threats. Artificial Intelligence (AI) technologies possess the capacity to significantly transform cloud security through their ability to promptly identify and thwart attacks, adjust to emerging risks, and offer intelligent perspectives for proactive security actions. The objective of this research study is to investigate the utilization of AI technologies in augmenting the security measures within cloud computing systems. This paper aims to offer significant insights and recommendations for businesses seeking to protect their cloud-based assets by analyzing the present state of cloud security, the capabilities of AI, and the possible advantages and obstacles associated with integrating AI into cloud security policies.
Keywords: Machine Learning, Natural Language Processing, Denial-of-Service attacks, Sentiment Analysis, Cloud computing.
852 Collocation Errors in English as Second Language (ESL) Essay Writing
Authors: Fatima Muhammad Shitu
Abstract:
In language learning, second language learners as well as native speakers commit errors in their attempt to achieve competence in the target language. The realm of collocation has to do with meaning relations between lexical items. In all human languages, there is a kind of ‘natural order’ in which words are arranged or relate to one another in sentences, so much so that when a word occurs in a given context, the related or naturally co-occurring word will automatically come to mind. It becomes an error, therefore, if students inappropriately pair or arrange such ‘naturally’ co-occurring lexical items in a text. It has been observed that most of the second language learners in this research group commit collocation errors. A study of this kind is very significant as it gives insight into the kinds of errors committed by learners. This will help the language teacher to identify the sources and causes of such errors as well as correct them, thereby guiding, helping and leading the learners towards achieving some level of competence in the language. The aim of the study is to understand the nature of these errors as stumbling blocks to effective essay writing. The objective of the study is to identify the errors and analyze their structural compositions, so as to determine whether there are similarities between students in this regard and to find out whether there are patterns to these kinds of errors which will enable the researcher to understand their sources and causes. In this descriptive study, the researcher sampled nine hundred essays collected from three hundred undergraduate learners of English as a second language at the Federal College of Education, Kano, North-West Nigeria, i.e. three essays per student. The essays, which were written during the lecture hour on three different occasions, were of similar thematic preoccupations (i.e. same topics) and length (i.e. same number of words). The errors were identified in a systematic manner whereby each error was recorded only once even if it occurred several times in a student’s essays. The data were collated by converting the identified numbers of occurrences into percentages. The findings from the study indicate that there are similarities as well as regular and repeated errors which provide a pattern. Based on the pattern identified, the conclusion is that students’ collocation errors are attributable to poor teaching and learning, which results in the wrong generalization of rules.
Keywords: Collocations, errors, collocation errors, second language learning.
851 A TIPSO-SVM Expert System for Efficient Classification of TSTO Surrogates
Authors: Ali Sarosh, Dong Yun-Feng, Muhammad Umer
Abstract:
Fully reusable spaceplanes do not exist as yet. This implies that design qualification for the optimized, highly integrated forebody-inlet configuration of the booster-stage vehicle cannot be based on archival data of other spaceplanes. Therefore, this paper proposes a novel TIPSO-SVM expert system methodology. A non-trivial problem related to the optimization and classification of the hypersonic forebody-inlet configuration in conjunction with the mass-model of the two-stage-to-orbit (TSTO) vehicle is solved. The hybrid-heuristic machine learning methodology is based on the two-step improved particle swarm optimizer (TIPSO) algorithm and the two-step support vector machine (SVM) data classification method. The efficacy of the method is tested by first evolving an optimal configuration for the hypersonic compression system using the TIPSO algorithm and thereafter classifying the results using the two-step SVM method. In the first step, extensive but unclassified mass-model training data for multiple optimized configurations are segregated and pre-classified for learning of the SVM algorithm. In the second step, the TIPSO-optimized mass-model data are classified using the SVM classifier. Results showed remarkable improvement in the configuration and mass-model, along with the sizing parameters.
Keywords: TIPSO-SVM expert system, TIPSO algorithm, two-step SVM method, aerothermodynamics, mass-modeling, TSTO vehicle.
850 Performance Analysis of Evolutionary ANN for Output Prediction of a Grid-Connected Photovoltaic System
Authors: S.I Sulaiman, T.K Abdul Rahman, I. Musirin, S. Shaari
Abstract:
This paper presents a performance analysis of the Evolutionary Programming-Artificial Neural Network (EPANN) based technique to optimize the architecture and training parameters of a one-hidden-layer feedforward ANN model for the prediction of energy output from a grid-connected photovoltaic system. The ANN utilizes solar radiation and ambient temperature as its inputs, while the output is the total watt-hour energy produced by the grid-connected PV system. EP is used to optimize the regression performance of the ANN model by determining the optimum values for the number of nodes in the hidden layer as well as the optimal momentum rate and learning rate for the training. The EPANN model is tested using two types of transfer function for the hidden layer, namely the tangent sigmoid and logarithmic sigmoid. The best transfer function, neural topology and learning parameters were selected based on the highest regression performance obtained during the ANN training and testing process. It is observed that the best transfer function configuration for the prediction model is [logarithmic sigmoid, purely linear].
Keywords: Artificial neural network (ANN), Correlation coefficient (R), Evolutionary programming-ANN (EPANN), Photovoltaic (PV), logarithmic sigmoid and tangent sigmoid.
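As a rough, illustrative sketch of the evolutionary tuning described above (not the authors' implementation), the following Python snippet searches over the hidden-layer size, learning rate and momentum of a one-hidden-layer network using scikit-learn's MLPRegressor and scores candidates by the correlation coefficient R; the input arrays, population size and mutation scales are invented placeholders.

import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.random((500, 2))                                        # placeholder [solar radiation, ambient temperature]
y = 3.0 * X[:, 0] + 0.5 * X[:, 1] + rng.normal(0, 0.05, 500)    # placeholder watt-hour output
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

def fitness(genome):
    nodes = max(2, int(round(genome[0])))
    lr = float(np.clip(genome[1], 1e-4, 0.5))
    mom = float(np.clip(genome[2], 0.0, 0.99))
    model = MLPRegressor(hidden_layer_sizes=(nodes,), activation="logistic", solver="sgd",
                         learning_rate_init=lr, momentum=mom, max_iter=2000, random_state=0)
    model.fit(X_tr, y_tr)
    return np.corrcoef(model.predict(X_te), y_te)[0, 1]         # regression performance R

pop = [np.array([rng.integers(2, 30), rng.uniform(0.001, 0.3), rng.uniform(0.1, 0.9)])
       for _ in range(8)]
for _ in range(5):                                              # EP loop: mutate, evaluate, keep the best half
    offspring = [np.abs(p + rng.normal(0, [2.0, 0.02, 0.05])) for p in pop]
    pop = sorted(pop + offspring, key=fitness, reverse=True)[:8]
print("best (nodes, learning rate, momentum):", pop[0], "R =", fitness(pop[0]))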
849 Metal Ship and Robotic Car: A Hands-On Activity to Develop Scientific and Engineering Skills for High School Students
Authors: Jutharat Sunprasert, Ekapong Hirunsirisawat, Narongrit Waraporn, Somporn Peansukmanee
Abstract:
Metal Ship and Robotic Car is one of the hands-on activities in the course, the Fundamental of Engineering, and can be divided into three parts. In the first part, the metal ships were made using engineering drawings and knowledge of physics and mathematics. The second part is where the students learned how to construct a robotic car and control it using computer programming. In the last part, the students had to combine the workings of these two objects in the final testing. The aim of this study was to investigate the effectiveness of the hands-on activity by integrating Science, Technology, Engineering and Mathematics (STEM) concepts to develop scientific and engineering skills. The results showed that the majority of students felt this hands-on activity led to an increased confidence level in the integration of STEM. Moreover, 48% of all students engaged well with the STEM concepts. Students could obtain knowledge of STEM through hands-on activities with the topics of science and mathematics, engineering drawing, engineering workshop and computer programming; most students agreed or strongly agreed with this learning process. This indicated that the hands-on activity “Metal Ship and Robotic Car” is a useful tool to integrate each aspect of STEM. Furthermore, hands-on activities positively influence a student’s interest, which leads to increased learning achievement and to the development of scientific and engineering skills.
Keywords: Hands-on activity, STEM education, computer programming, metal work.
848 Performance Comparison of Situation-Aware Models for Activating Robot Vacuum Cleaner in a Smart Home
Authors: Seongcheol Kwon, Jeongmin Kim, Kwang Ryel Ryu
Abstract:
We assume an IoT-based smart-home environment where the on-off status of each of the electrical appliances, including the room lights, can be recognized in real time by monitoring and analyzing the smart meter data. At any moment in such an environment, we can recognize what the household or the user is doing by referring to the status data of the appliances. In this paper, we focus on a smart-home service that activates a robot vacuum cleaner at the right time by recognizing the user situation, which requires a situation-aware model that can distinguish the situations that allow vacuum cleaning (Yes) from those that do not (No). We learn as our candidate models a few classifiers such as naïve Bayes, decision tree, and logistic regression that can map the appliance-status data into Yes and No situations. Our training and test data are obtained from simulations of user behaviors, in which a sequence of user situations such as cooking, eating, dish washing, and so on is generated with the status of the relevant appliances changed in accordance with the situation changes. During the simulation, both the situation transition and the resulting appliance status are determined stochastically. To compare the performances of the aforementioned classifiers, we obtain their learning curves for different types of users through simulations. The result of our empirical study reveals that naïve Bayes achieves a slightly better classification accuracy than the other compared classifiers.
Keywords: Situation-awareness, Smart home, IoT, Machine learning, Classifier.
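As a minimal sketch of this kind of comparison, the Python snippet below trains the three classifiers named above on simulated binary appliance-status vectors; the appliance layout, the Yes/No labelling rule and the noise level are invented placeholders, not the simulation used in the paper.

import numpy as np
from sklearn.naive_bayes import BernoulliNB
from sklearn.tree import DecisionTreeClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
X = rng.integers(0, 2, size=(1000, 8))                      # placeholder on/off status of 8 appliances
# placeholder rule: cleaning is allowed when the TV (col 0) and cooker (col 1) are off
y = ((X[:, 0] == 0) & (X[:, 1] == 0)).astype(int)
y = y ^ (rng.random(1000) < 0.05).astype(int)               # flip ~5% of labels to mimic stochastic simulation

models = {"naive Bayes": BernoulliNB(),
          "decision tree": DecisionTreeClassifier(max_depth=5, random_state=0),
          "logistic regression": LogisticRegression(max_iter=1000)}
for name, model in models.items():
    scores = cross_val_score(model, X, y, cv=5)
    print(f"{name}: mean accuracy = {scores.mean():.3f}")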
847 Integrating Microcontroller-Based Projects in a Human-Computer Interaction Course
Authors: Miguel Angel Garcia-Ruiz, Pedro Cesar Santana-Mancilla, Laura Sanely Gaytan-Lugo
Abstract:
This paper describes the design and application of a short in-class project conducted in Algoma University’s Human-Computer Interaction (HCI) course taught at the Bachelor of Computer Science. The project was based on the Maker Movement (people using and reusing electronic components and everyday materials to tinker with technology and make interactive applications), where students applied low-cost and easy-to-use electronic components, the Arduino Uno microcontroller board, software tools, and everyday objects. Students collaborated in small teams by completing hands-on activities with them, making an interactive walking cane for blind people. At the end of the course, students filled out a Technology Acceptance Model version 2 (TAM2) questionnaire where they evaluated microcontroller boards’ applications in HCI classes. We also asked them about applying the Maker Movement in HCI classes. Results showed overall students’ positive opinions and response about using microcontroller boards in HCI classes. We strongly suggest that every HCI course should include practical activities related to tinkering with technology such as applying microcontroller boards, where students actively and constructively participate in teams for achieving learning objectives.
Keywords: Maker movement, microcontrollers, learning, projects, course, technology acceptance.
846 Automatic Staging and Subtype Determination for Non-Small Cell Lung Carcinoma Using PET Image Texture Analysis
Authors: Seyhan Karaçavuş, Bülent Yılmaz, Ömer Kayaaltı, Semra İçer, Arzu Taşdemir, Oğuzhan Ayyıldız, Kübra Eset, Eser Kaya
Abstract:
In this study, our goal was to perform tumor staging and subtype determination automatically using different texture analysis approaches for a very common cancer type, i.e., non-small cell lung carcinoma (NSCLC). In particular, we introduced a texture analysis approach, called Laws’ texture filter, to be used in this context for the first time. The 18F-FDG PET images of 42 patients with NSCLC were evaluated. The number of patients for each tumor stage, i.e., I-II, III or IV, was 14. The patients had ~45% adenocarcinoma (ADC) and ~55% squamous cell carcinoma (SqCC). The MATLAB technical computing language was employed in the extraction of 51 features by using first order statistics (FOS), the gray-level co-occurrence matrix (GLCM), the gray-level run-length matrix (GLRLM), and Laws’ texture filters. The feature selection method employed was sequential forward selection (SFS). Selected textural features were used in the automatic classification by k-nearest neighbors (k-NN) and support vector machines (SVM). In the automatic classification of tumor stage, the accuracy was approximately 59.5% with the k-NN classifier (k=3) and 69% with SVM (with the one-versus-one paradigm), using 5 features. In the automatic classification of tumor subtype, the accuracy was around 92.7% with SVM one vs. one. Texture analysis of FDG-PET images might be used, in addition to metabolic parameters, as an objective tool to assess tumor histopathological characteristics and in the automatic classification of tumor stage and subtype.
Keywords: Cancer stage, cancer cell type, non-small cell lung carcinoma, PET, texture analysis.
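For readers who want to experiment with this kind of pipeline, here is a minimal Python sketch (an assumption-laden stand-in, not the MATLAB code used in the study) that extracts a few GLCM properties from placeholder 2-D regions of interest and cross-validates SVM and k-NN classifiers; it assumes scikit-image 0.19 or newer for graycomatrix/graycoprops and omits FOS, GLRLM, Laws’ filters and SFS.

import numpy as np
from skimage.feature import graycomatrix, graycoprops
from sklearn.svm import SVC
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(2)
rois = rng.integers(0, 32, size=(40, 24, 24)).astype(np.uint8)    # placeholder 2-D tumour ROIs
labels = rng.integers(0, 2, size=40)                              # placeholder ADC vs. SqCC labels

def glcm_features(roi):
    glcm = graycomatrix(roi, distances=[1], angles=[0, np.pi / 2], levels=32,
                        symmetric=True, normed=True)
    return [graycoprops(glcm, p).mean()
            for p in ("contrast", "homogeneity", "energy", "correlation")]

X = np.array([glcm_features(r) for r in rois])
for name, clf in [("SVM (one-vs-one)", SVC(decision_function_shape="ovo")),
                  ("k-NN (k=3)", KNeighborsClassifier(n_neighbors=3))]:
    print(name, cross_val_score(clf, X, labels, cv=5).mean())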
845 User-Perceived Quality Factors for Certification Model of Web-Based System
Authors: Jamaiah H. Yahaya, Aziz Deraman, Abdul Razak Hamdan, Yusmadi Yah Jusoh
Abstract:
One of the most essential issues in software products is to maintain their relevancy to the dynamics of the user’s requirements and expectations. Many studies have been carried out on the quality aspect of software products to overcome these problems. Previous software quality assessment models and metrics have been introduced, each with strengths and limitations. In order to enhance the assurance and buoyancy of software products, certification models have been introduced and developed. From our previous experiences in certification exercises and case studies, collaborating with several agencies in Malaysia, the requirement for a user-based software certification approach was identified and demanded. The emergence of social network applications, new development approaches such as the agile method, and other varieties of software in the market have led to the domination of users over the software. As software becomes more accessible to the public through internet applications, users are becoming more critical of the quality of the services provided by the software. There are several categories of users in web-based systems with different interests and perspectives. The classifications and metrics are identified through a brainstorming approach which includes researchers, users and experts in this area. The new paradigm in software quality assessment is the main focus of our research. This paper discusses the classifications of users in web-based software system assessment and their associated factors and metrics for quality measurement. The quality model is derived based on the IEEE structure and the FCM model. The developments are beneficial and valuable in overcoming the constraints and improving the application of the software certification model in future.
Keywords: Software certification model, user centric approach, software quality factors, metrics and measurements, web-based system.
844 Optimum Design of Steel Space Frames by Hybrid Teaching-Learning Based Optimization and Harmony Search Algorithms
Authors: Alper Akın, İbrahim Aydoğdu
Abstract:
This study presents a hybrid metaheuristic algorithm to obtain optimum designs for steel space buildings. The optimum design problem of three-dimensional steel frames is mathematically formulated according to the provisions of LRFD-AISC (Load and Resistance Factor Design of the American Institute of Steel Construction). Design constraints such as the strength requirements of structural members, the displacement limitations, the inter-story drift and the other structural constraints are derived from the LRFD-AISC specification. In this study, a hybrid algorithm using the teaching-learning based optimization (TLBO) and harmony search (HS) algorithms is employed to solve the stated optimum design problem. These algorithms are two of the recent additions to the metaheuristic techniques of numerical optimization and have been an efficient tool for solving discrete programming problems. Using these two algorithms in collaboration creates a more powerful tool and mitigates each other’s weaknesses. To demonstrate the powerful performance of the presented hybrid algorithm, the optimum design of a large scale steel building is presented and the results are compared to the previously obtained results available in the literature.
Keywords: Optimum structural design, hybrid techniques, teaching-learning based optimization, harmony search algorithm, minimum weight, steel space frame.
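The sketch below illustrates, in heavily simplified form and on a toy unconstrained objective, how a TLBO teacher phase and a harmony search improvisation step can be interleaved in one loop; the real problem (discrete, LRFD-AISC-constrained frame weight minimisation, plus the TLBO learner phase) is not reproduced here, and all parameter values are assumptions.

import numpy as np

rng = np.random.default_rng(3)
f = lambda x: np.sum((x - 1.5) ** 2)          # toy objective standing in for frame weight
pop = rng.uniform(0, 5, size=(20, 6))         # 20 candidate designs, 6 toy design variables

for _ in range(200):
    costs = np.array([f(x) for x in pop])
    teacher, mean = pop[costs.argmin()].copy(), pop.mean(axis=0)
    Tf = rng.integers(1, 3)                   # teaching factor (1 or 2)
    for i in range(len(pop)):                 # TLBO teacher phase: move towards the best design
        cand = pop[i] + rng.random(6) * (teacher - Tf * mean)
        if f(cand) < f(pop[i]):
            pop[i] = cand
    # HS improvisation: assemble one new harmony from memory, occasionally pitch-adjusted
    new = pop[rng.integers(len(pop), size=6), np.arange(6)]
    new = new + rng.uniform(-0.1, 0.1, 6) * (rng.random(6) < 0.3)
    worst = np.array([f(x) for x in pop]).argmax()
    if f(new) < f(pop[worst]):
        pop[worst] = new

print("best toy objective value:", min(f(x) for x in pop))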
843 Implementing a Visual Servoing System for Robot Controlling
Authors: Maryam Vafadar, Alireza Behrad, Saeed Akbari
Abstract:
Nowadays, with the emergence of new applications like robot control in image processing, artificial vision for visual servoing is a rapidly growing discipline, and human-machine interaction plays a significant role in controlling the robot. This paper presents a new algorithm based on spatio-temporal volumes for visual servoing aimed at controlling robots. In this algorithm, after applying the necessary pre-processing on video frames, a spatio-temporal volume is constructed for each gesture and a feature vector is extracted. These volumes are then analyzed for matching in two consecutive stages. For hand gesture recognition and classification we tested different classifiers including k-nearest neighbor, learning vector quantization and back propagation neural networks. We tested the proposed algorithm with the collected data set and the results showed a correct gesture recognition rate of 99.58 percent. We also tested the algorithm with noisy images, and the algorithm showed a correct recognition rate of 97.92 percent on noisy images.
Keywords: Back propagation neural network, Feature vector, Hand gesture recognition, k-Nearest Neighbor, Learning vector quantization neural network, Robot control, Spatio-temporal volume, Visual servoing
842 Quality of Life Assessment across the Cancer Continuum: Understanding the Role of an Exercise Rehabilitation Programme
Authors: Bernat-Carles Serdà Ferrer, Arantza Del Valle Gómez
Abstract:
The Quality of Life (QoL) paradigm is multidimensional, dynamic and modular, and its definition differs across the cancer continuum. The challenge in the interpretation of QoL data in clinical research is that QoL is influenced by psychological phenomena such as adaptation to illness. This research aims to obtain a valid and sensitive assessment of QoL change over the disease continuum, and to evaluate a rehabilitation programme aimed at inverting the observed decrease in QoL when patients return to daily living activities. The sample comprised 66 men. Patients were first assessed to establish a baseline (P1-diagnosis). This was followed by a post-test (P2-discharge) and a then-test measurement (P3-retrospective evaluation), and after returning home patients were randomized into experimental and control groups. The experimental group attended a rehabilitation programme over 24 weeks (P4). Results show that from baseline to post-test, QoL decreased significantly. The recalibration then-test confirmed a low QoL in all periods evaluated. Significant differences between the experimental and control groups prove the positive effect of the Exercise Rehabilitation Programme (ERP) on QoL. Understanding the real dynamics of QoL over time would help to adapt rehabilitation programmes by improving sensitivity and efficacy, and would provide professionals with a more accurate perception of the impact of treatment and side effects on patients’ QoL. Our results underline the importance of changing the approach adopted by health professionals towards one of watchful waiting on patients’ QoL until their complete recovery in daily life.
Keywords: Prostate cancer, quality of life, rehabilitation programme, response shift.
841 Classification of Potential Biomarkers in Breast Cancer Using Artificial Intelligence Algorithms and Anthropometric Datasets
Authors: Aref Aasi, Sahar Ebrahimi Bajgani, Erfan Aasi
Abstract:
Breast cancer (BC) continues to be the most frequent cancer in females and causes the highest number of cancer-related deaths in women worldwide. Inspired by recent advances in studying the relationship between different patient attributes and features and the disease, in this paper, we have tried to investigate the different classification methods for better diagnosis of BC in the early stages. In this regard, datasets from the University Hospital Centre of Coimbra were chosen, and different machine learning (ML)-based and neural network (NN) classifiers have been studied. For this purpose, we have selected favorable features among the nine provided attributes from the clinical dataset by using a random forest algorithm. This dataset consists of both healthy controls and BC patients, and it was noted that glucose, BMI, resistin, and age have the most importance, respectively. Moreover, we have analyzed these features with various ML-based classifier methods, including Decision Tree (DT), K-Nearest Neighbors (KNN), eXtreme Gradient Boosting (XGBoost), Logistic Regression (LR), Naive Bayes (NB), and Support Vector Machine (SVM) along with NN-based Multi-Layer Perceptron (MLP) classifier. The results revealed that among different techniques, the SVM and MLP classifiers have the most accuracy, with amounts of 96% and 92%, respectively. These results divulged that the adopted procedure could be used effectively for the classification of cancer cells, and also it encourages further experimental investigations with more collected data for other types of cancers.
Keywords: Breast cancer, health diagnosis, Machine Learning, biomarker classification, Neural Network.
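A minimal Python sketch of the described workflow is given below: a random forest ranks the nine clinical attributes, and SVM and MLP classifiers are then cross-validated on the top-ranked ones. The attribute names follow the Coimbra dataset, but X and y here are random placeholders rather than the actual measurements.

import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.svm import SVC
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import cross_val_score

features = ["Age", "BMI", "Glucose", "Insulin", "HOMA", "Leptin",
            "Adiponectin", "Resistin", "MCP-1"]
rng = np.random.default_rng(4)
X = rng.normal(size=(116, 9))                 # placeholder: 116 subjects, 9 attributes
y = rng.integers(0, 2, size=116)              # placeholder: 0 = control, 1 = BC patient

rf = RandomForestClassifier(n_estimators=300, random_state=0).fit(X, y)
ranked = sorted(zip(rf.feature_importances_, features), reverse=True)
print("feature ranking:", [name for _, name in ranked])

top = [features.index(name) for _, name in ranked[:4]]     # keep the four best attributes
for name, clf in [("SVM", SVC(kernel="rbf")), ("MLP", MLPClassifier(max_iter=2000))]:
    pipe = make_pipeline(StandardScaler(), clf)
    print(name, cross_val_score(pipe, X[:, top], y, cv=5).mean())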
840 Criminal Justice System, Health and Imprisonment in India
Authors: Debolina Chatterjee, Suhita Chopra Chatterjee
Abstract:
Imprisonment is an expansive concept, as it is regulated by laws under the criminal justice system of the state. The state sets principles of punishment to control offenders and also puts limits on excess punitive control. One significant way through which it exercises control is through rules governing the healthcare of the imprisoned population. Prisons signify specialized settings which accommodate both medical and legal concerns. The provision of care operates within the institutional paradigm of punishment. This requires the state to negotiate adequately between the goals of punishment and the fulfilment of the basic human rights of offenders. The present study is based on a critical analysis of prison healthcare standards in India, which include government policies and guidelines. It also demonstrates how healthcare is delivered by drawing insights from a primary study conducted in a correctional home in the state of West Bengal, India, which houses both male and female inmates. Forty women were interviewed through semi-structured interviews, followed by focus group discussions. Doctors and administrative personnel were also interviewed. Findings show how institutional practices control women through the subversion of the role of doctors to prison administration. Also, poor healthcare infrastructure, unavailability of specialized services, and hierarchies between personnel and inmates make prisons unlikely sites for therapeutic intervention. The paper further discusses how institutional practices foster gender-based discriminatory practices.
Keywords: Imprisonment, imprisoned women, prison healthcare, prison policies.
839 Identifying Autism Spectrum Disorder Using Optimization-Based Clustering
Authors: Sharifah Mousli, Sona Taheri, Jiayuan He
Abstract:
Autism spectrum disorder (ASD) is a complex developmental condition involving persistent difficulties with social communication, restricted interests, and repetitive behavior. The challenges associated with ASD can interfere with an affected individual’s ability to function in social, academic, and employment settings. Although there is no effective medication known to treat ASD, to our best knowledge, early intervention can significantly improve an affected individual’s overall development. Hence, an accurate diagnosis of ASD at an early phase is essential. The use of machine learning approaches improves and speeds up the diagnosis of ASD. In this paper, we focus on the application of unsupervised clustering methods in ASD, as a large volume of ASD data generated through hospitals, therapy centers, and mobile applications has no pre-existing labels. We conduct a comparative analysis using seven clustering approaches, such as K-means, agglomerative hierarchical, model-based, fuzzy-C-means, affinity propagation, self organizing maps, linear vector quantisation – as well as the recently developed optimization-based clustering (COMSEP-Clust) approach. We evaluate the performances of the clustering methods extensively on real-world ASD datasets encompassing different age groups: toddlers, children, adolescents, and adults. Our experimental results suggest that the COMSEP-Clust approach outperforms the other seven methods in recognizing ASD with well-separated clusters.
Keywords: Autism spectrum disorder, clustering, optimization, unsupervised machine learning.
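As an illustration of this kind of comparison, the sketch below runs several of the listed clustering methods on placeholder screening features and reports silhouette scores; COMSEP-Clust itself is not included, since it is not available as an off-the-shelf package, and the data here are synthetic.

import numpy as np
from sklearn.cluster import KMeans, AgglomerativeClustering, AffinityPropagation
from sklearn.mixture import GaussianMixture
from sklearn.preprocessing import StandardScaler
from sklearn.metrics import silhouette_score

rng = np.random.default_rng(5)
X = StandardScaler().fit_transform(rng.normal(size=(300, 10)))   # placeholder ASD screening features

models = {
    "k-means": KMeans(n_clusters=2, n_init=10, random_state=0),
    "agglomerative": AgglomerativeClustering(n_clusters=2),
    "model-based (GMM)": GaussianMixture(n_components=2, random_state=0),
    "affinity propagation": AffinityPropagation(random_state=0),
}
for name, model in models.items():
    labels = model.fit_predict(X)
    if len(set(labels)) > 1:                  # silhouette needs at least two clusters
        print(f"{name}: silhouette = {silhouette_score(X, labels):.3f}")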
838 The Discovery and Application of Perspective Representation in Modern Italy
Authors: Matthias Stange
Abstract:
In the early modern period, a different image of man began to prevail in Europe. The focus was on the self-determined human being and his abilities. At first, these developments could be seen in Italian painting and architecture, which again oriented itself to the concepts and forms of antiquity. For example, through the discovery of perspective representation by Brunelleschi or later the orthogonal projection by Alberti, after the ancient knowledge of optics had been forgotten in the Middle Ages. The understanding of reality in the Middle Ages was not focused on the sensually perceptible world, but was determined by ecclesiastical dogmas. The empirical part of this study examines the rediscovery and development of perspective. With the paradigm of antiquity, the figure of the architect was also recognised again - the cultural man trained theoretically and practically in numerous subjects, as Vitruvius describes him. In this context, the role of the architect, the influence on the painting of the Quattrocento as well as the influence on architectural representation in the Baroque period are examined. Baroque is commonly associated with the idea of illusionistic appearance as opposed to the tangible reality presented in the Renaissance. The study has shown that the central perspective projection developed by Filippo Brunelleschi enabled another understanding of seeing and the dissemination of painted images. Brunelleschi's development made it possible to understand the sight of nature as a reflection of what is presented to the viewer's eye. Alberti later shortened Brunelleschi's central perspective representation for practical use in painting. In early modern Italian architecture and painting, these developments apparently supported each other. The pictorial representation of architecture initially served the development of an art form before it became established in building practice itself.
Keywords: Alberti, Brunelleschi, Central perspective projection, Orthogonal projection, Quattrocento, Baroque.
837 Climate Safe House: A Community Housing Project Tackling Catastrophic Sea Level Rise in Coastal Communities
Authors: Chris Fersterer, Col Fay, Tobias Danielmeier, Kat Achterberg, Scott Willis
Abstract:
New Zealand, an island nation, has an extensive coastline peppered with small communities of iconic buildings known as Bachs. Post WWII, these modest buildings were constructed by their owners as retreats; they generally were small and low cost, often used recycled materials, and often fell below current acceptable building standards. In the latter part of the 20th century, real estate prices in many of these communities remained low and these areas became permanent residences for people attracted to this affordable lifestyle choice. The Blueskin Resilient Communities Trust (BRCT) is an organisation that recognises the vulnerability of communities in low lying settlements as now being prone to increased flood threat brought about by climate change and sea level rise. Some of the inhabitants of Blueskin Bay, Otago, NZ have already found their properties to be uninsurable because of the increased frequency of flood events, and property values have slumped accordingly. Territorial authorities also acknowledge this increased risk and have created additional compliance measures for new buildings that are less than 2 m above tidal peaks. Community resilience becomes an additional concern where inhabitants are attracted to a lifestyle associated with a specific location and its people, when this lifestyle cannot be met in a suburban or city context. Traditional models of social housing fail to provide the sense of community connectedness and identity enjoyed by the current residents of Blueskin Bay. BRCT have partnered with the Otago Polytechnic Design School to design a new form of community housing that can react to this environmental change. It is a longitudinal project incorporating participatory approaches as a means of getting people ‘on board’, to understand complex systems and co-develop solutions. In the first period, they are seeking industry support and funding to develop a transportable and fully self-contained housing model that exploits current technologies. BRCT also hope that the building will become an educational tool to highlight the climate change issues facing us today. This paper uses the Climate Safe House (CSH) as a case study for education in architectural sustainability through experiential learning offered as part of the Otago Polytechnic Bachelor of Design. Students engage with the project through research methodologies including site surveys, resident interviews, data sourced from government agencies and physical modelling. The process involves collaboration across design disciplines, including product and interior design, but also includes connections with industry, both within the education institution and with stakeholder industries introduced through BRCT. This project offers a rich learning environment where students become engaged through project-based learning within a community of practice, including architecture, construction, energy and other related fields. The design outcomes are expressed in a series of public exhibitions and forums where community input is sought in a truly participatory process.
Keywords: Community resilience, problem based learning, project based learning, case study.
836 ECG Based Reliable User Identification Using Deep Learning
Authors: R. N. Begum, Ambalika Sharma, G. K. Singh
Abstract:
Identity theft has serious ramifications beyond data and personal information loss. This necessitates the implementation of robust and efficient user identification systems. Therefore, automatic biometric recognition systems are the need of the hour, and electrocardiogram (ECG)-based systems are unquestionably the best choice due to their appealing inherent characteristics. The Convolutional Neural Networks (CNNs) are the recent state-of-the-art techniques for ECG-based user identification systems. However, the results obtained are significantly below standards, and the situation worsens as the number of users and types of heartbeats in the dataset grows. As a result, this study proposes a highly accurate and resilient ECG-based person identification system using CNN's dense learning framework. The proposed research explores explicitly the caliber of dense CNNs in the field of ECG-based human recognition. The study tests four different configurations of dense CNN which are trained on a dataset of recordings collected from eight popular ECG databases. With the highest False Acceptance Rate (FAR) of 0.04% and the highest False Rejection Rate (FRR) of 5%, the best performing network achieved an identification accuracy of 99.94%. The best network is also tested with various train/test split ratios. The findings show that DenseNets are not only extremely reliable, but also highly efficient. Thus, they might also be implemented in real-time ECG-based human recognition systems.
Keywords: Biometrics, dense networks, identification rate, train/test split ratio.
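To make the dense-connectivity idea concrete, the following tf.keras sketch builds a small densely connected 1-D CNN for fixed-length ECG segments; the layer sizes, the 500-sample window and the 8-subject softmax head are illustrative assumptions, not the four configurations evaluated in the paper.

import tensorflow as tf
from tensorflow.keras import layers

def dense_block(x, n_layers=4, growth=16):
    # each layer sees the concatenated feature maps of all previous layers
    for _ in range(n_layers):
        y = layers.BatchNormalization()(x)
        y = layers.Activation("relu")(y)
        y = layers.Conv1D(growth, kernel_size=9, padding="same")(y)
        x = layers.Concatenate()([x, y])
    return x

inputs = tf.keras.Input(shape=(500, 1))                  # e.g. a 500-sample heartbeat window
x = layers.Conv1D(32, 15, padding="same", activation="relu")(inputs)
x = dense_block(x)
x = layers.GlobalAveragePooling1D()(x)
outputs = layers.Dense(8, activation="softmax")(x)       # one class per enrolled subject (illustrative)

model = tf.keras.Model(inputs, outputs)
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy", metrics=["accuracy"])
model.summary()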
835 Pilot Study on the Impact of VLE on Mathematical Concepts Acquisition within Secondary Education in England
Authors: Aaron A. R. Nwabude
Abstract:
The research investigates the “impact of VLE on mathematical concepts acquisition of the special education needs (SEN) students in the KS4 secondary education sector” in England. The overall aim of the study is to establish possible areas of difficulty to approach for above or below knowledge standard requirements for KS4 students in the acquisition and validation of basic mathematical concepts. A teaching period, in which a virtual learning environment (Fronter) was used to emphasise different mathematical perception and symbolic representation, was carried out, and a task-based survey was conducted with 20 special education needs students [14 actually took part]. The result shows that students were able to process information and consider images, objects and numbers within the VLE at early stages of the acquisition process. They were also able to carry out perceptual tasks, but with a limiting process of different quotient; thus they needed the teacher’s guidance to connect them to symbolic representations and sometimes to coach them through. The pilot study further indicates that VLE curriculum approaches for students were only minutely aligned with mathematics teaching, which does not emphasise the integration of the VLE into the existing curriculum and current teaching practice. There was also poor alignment of vision regarding the use of the VLE in the realisation of the objectives of teaching mathematics by the management. On the part of teacher training, not much was done to develop teachers’ skills in the technical and pedagogical aspects of the VLE that is in use at the school. The classroom observation confirmed that teaching practice would come to rely on the VLE as an enhancer of mathematical skills, providing interaction and personalisation of learning to SEN students.
Keywords: VLE, Mathematical Concepts Acquisition, Pilot Study, SENs, KS4, Education, Teacher.
834 Complex-Valued Neural Network in Image Recognition: A Study on the Effectiveness of Radial Basis Function
Authors: Anupama Pande, Vishik Goel
Abstract:
A complex valued neural network is a neural network, which consists of complex valued input and/or weights and/or thresholds and/or activation functions. Complex-valued neural networks have been widening the scope of applications not only in electronics and informatics, but also in social systems. One of the most important applications of the complex valued neural network is in image and vision processing. In Neural networks, radial basis functions are often used for interpolation in multidimensional space. A Radial Basis function is a function, which has built into it a distance criterion with respect to a centre. Radial basis functions have often been applied in the area of neural networks where they may be used as a replacement for the sigmoid hidden layer transfer characteristic in multi-layer perceptron. This paper aims to present exhaustive results of using RBF units in a complex-valued neural network model that uses the back-propagation algorithm (called 'Complex-BP') for learning. Our experiments results demonstrate the effectiveness of a Radial basis function in a complex valued neural network in image recognition over a real valued neural network. We have studied and stated various observations like effect of learning rates, ranges of the initial weights randomly selected, error functions used and number of iterations for the convergence of error on a neural network model with RBF units. Some inherent properties of this complex back propagation algorithm are also studied and discussed.
Keywords: Complex valued neural network, Radial Basis Function, Image recognition.
833 Investigating Interference Errors Made by Azzawia University 1st year Students of English in Learning English Prepositions
Authors: Aimen Mohamed Almaloul
Abstract:
The main focus of this study is investigating the interference of Arabic in the use of English prepositions by Libyan university students. Prepositions in the tests used in the study were categorized, according to their relation to Arabic, into similar Arabic and English prepositions (SAEP), dissimilar Arabic and English prepositions (DAEP), Arabic prepositions with no English counterparts (APEC), and English prepositions with no Arabic counterparts (EPAC).
The subjects of the study were 100 first-year university students, both male and female, of the English department, Sabrata Faculty of Arts, Azzawia University. The basic tool for data collection was a test of English prepositions; students were instructed to fill in the blanks with the correct prepositions and to put a zero (0) if no preposition was needed. The test was then handed to the subjects of the study.
The test was then scored and quantitative as well as qualitative results were obtained. Quantitative results indicated the number, percentages and rank order of errors in each of the categories and qualitative results indicated the nature and significance of those errors and their possible sources. Based on the obtained results the researcher could detect that students made more errors in the EPAC category than the other three categories and these errors could be attributed to the lack of knowledge of the different meanings of English prepositions. This lack of knowledge forced the students to adopt what is called the strategy of transfer.
Keywords: Foreign language acquisition, foreign language learning, interference system, interlanguage system, mother tongue interference.
832 Tariff as a Determining Factor in Choosing Mobile Operators: A Case Study from Higher Learning Institution in Dodoma Municipality in Tanzania
Authors: Justinian Anatory, Ekael Stephen Manase
Abstract:
In recent years, the adoption of mobile phones has been exceptionally rapid in many parts of the world, and Tanzania is no exception. We are witnessing a number of new mobile network operators being licensed from time to time by the Tanzania Communications Regulatory Authority (TCRA). This makes competition in the telecommunications market very stiff. All mobile phone companies are struggling to bring more new customers into their networks, and this trend causes stiff competition. Various measures are being taken by different companies, including lowering tariffs, introducing free short messages within and outside their networks, and offering free calls during off-peak periods. This paper is aimed at investigating the influence of tariffs on student mobile customers in selecting their mobile network operators. Seventy-seven students from higher learning institutions in Dodoma Municipality, Tanzania, participated by responding to the prepared questionnaires. The sought information was aimed at determining whether tariffs influenced students in the selection of their current mobile operators. The results indicate that tariffs were the major driving factor in the selection of mobile operators. However, female mobile customers were found to be more easily attracted into subscribing to a mobile operator due to low tariffs, a bigger number of free short messages or discounted call charges than their fellow male customers.
Keywords: Consumer Buying, mobile operators, tariff.
831 A Neurofuzzy Learning and its Application to Control System
Authors: Seema Chopra, R. Mitra, Vijay Kumar
Abstract:
A neurofuzzy approach for a given set of input-output training data is proposed in two phases. First, the data set is partitioned automatically into a set of clusters. Then a fuzzy if-then rule is extracted from each cluster to form a fuzzy rule base. Second, a fuzzy neural network is constructed accordingly and its parameters are tuned to increase the precision of the fuzzy rule base. This network is able to learn and optimize the rule base of a Sugeno-like fuzzy inference system using a hybrid learning algorithm, which combines gradient descent and the least mean square algorithm. The proposed neurofuzzy system has the advantage of determining the number of rules automatically; it also reduces the number of rules, decreases computational time, learns faster and consumes less memory. The authors also investigate how neurofuzzy techniques can be applied in the area of control theory to design a fuzzy controller for linear and nonlinear dynamic systems, modelled from a set of input/output data. A simulation analysis on a wide range of processes, to identify nonlinear components online in a control system, and a benchmark problem involving the prediction of a chaotic time series are carried out. Furthermore, the well-known examples of linear and nonlinear systems are also simulated under the Matlab/Simulink environment. The above combination is also illustrated in modeling the relationship between automobile trips and demographic factors.
Keywords: Fuzzy control, neuro-fuzzy techniques, fuzzy subtractive clustering, extraction of rules, and optimization of membership functions.
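The least-squares half of such a hybrid learning scheme can be sketched in a few lines: with Gaussian membership functions and normalised firing strengths, the consequents of a zero-order Sugeno rule base are obtained by linear least squares. The rule centres, widths and target function below are placeholders, not the output of subtractive clustering.

import numpy as np

rng = np.random.default_rng(6)
X = rng.uniform(-1, 1, size=(200, 2))                     # training inputs
y = np.sin(3 * X[:, 0]) + 0.5 * X[:, 1]                   # placeholder target to be modelled

centres = rng.uniform(-1, 1, size=(5, 2))                 # placeholder rule centres (5 rules)
sigma = 0.4                                               # placeholder membership width

def firing_strengths(X):
    # product of Gaussian memberships over the inputs, one strength per rule, then normalised
    d2 = ((X[:, None, :] - centres[None, :, :]) ** 2).sum(axis=2)
    w = np.exp(-d2 / (2 * sigma ** 2))
    return w / w.sum(axis=1, keepdims=True)

W = firing_strengths(X)
consequents, *_ = np.linalg.lstsq(W, y, rcond=None)       # least-squares fit of rule consequents
y_hat = W @ consequents
print("training RMSE:", np.sqrt(np.mean((y - y_hat) ** 2)))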
830 Understanding and Designing Situation-Aware Mobile and Ubiquitous Computing Systems
Authors: Kai Häussermann, Christoph Hubig, Paul Levi, Frank Leymann, Oliver Siemoneit, Matthias Wieland, Oliver Zweigle
Abstract:
Using spatial models as a shared common basis of information about the environment for different kinds of context-aware systems has been a heavily researched topic in the last years. Thereby the research focused on how to create, update, and merge spatial models so as to enable highly dynamic, consistent and coherent spatial models at large scale. In this paper, however, we want to concentrate on how context-aware applications could use this information so as to adapt their behavior according to the situation they are in. The main idea is to provide the spatial model infrastructure with a situation recognition component based on generic situation templates. A situation template is – as part of a much larger situation template library – an abstract, machine-readable description of a certain basic situation type, which could be used by different applications to evaluate their situation. In this paper, different theoretical and practical issues – technical, ethical and philosophical ones – are discussed that are important for understanding and developing situation-dependent systems based on situation templates. A basic system design is presented which allows for reasoning with uncertain data using an improved version of a learning algorithm for the automatic adaptation of situation templates. Finally, to support the development of adaptive applications, we present a new situation-aware adaptation concept based on workflows.
Keywords: context-awareness, ethics, facilitation of system use through workflows, situation recognition and learning based on situation templates and situation ontologies, theory of situation-aware systems
829 An Overview of Technology Availability to Support Remote Decentralized Clinical Trials
Authors: S. Huber, B. Schnalzer, B. Alcalde, S. Hanke, L. Mpaltadoros, T. G. Stavropoulos, S. Nikolopoulos, I. Kompatsiaris, L. Pérez-Breva, V. Rodrigo-Casares, J. Fons-Martínez, J. de Bruin
Abstract:
Developing new medicine and health solutions and improving patient health currently rely on the successful execution of clinical trials, which generate relevant safety and efficacy data. For their success, recruitment and retention of participants are some of the most challenging aspects of protocol adherence. Main barriers include: i) lack of awareness of clinical trials; ii) long distance from the clinical site; iii) the burden on participants, including the duration and number of clinical visits, and iv) high dropout rate. Most of these aspects could be addressed with a new paradigm, namely the Remote Decentralized Clinical Trials (RDCTs). Furthermore, the COVID-19 pandemic has highlighted additional advantages and challenges for RDCTs in practice, allowing participants to join trials from home and not depending on site visits, etc. Nevertheless, RDCTs should follow the process and the quality assurance of conventional clinical trials, which involve several processes. For each part of the trial, the Building Blocks, existing software and technologies were assessed through a systematic search. The technology needed to perform RDCTs is widely available and validated but is yet segmented and developed in silos, as different software solutions address different parts of the trial and at various levels. The current paper is analyzing the availability of technology to perform RDCTs, identifying gaps and providing an overview of Basic Building Blocks and functionalities that need to be covered to support the described processes.
Keywords: architectures and frameworks for health informatics systems, clinical trials, information and communications technology, remote decentralized clinical trials, technology availability
828 A Machine Learning Approach for Earthquake Prediction in Various Zones Based on Solar Activity
Authors: Viacheslav Shkuratskyy, Aminu Bello Usman, Michael O’Dea, Mujeeb Ur Rehman, Saifur Rahman Sabuj
Abstract:
This paper examines relationships between solar activity and earthquakes by applying machine learning techniques: K-nearest neighbour, support vector regression, random forest regression, and a long short-term memory network. Data from the SILSO World Data Center, the NOAA National Center, the GOES satellite, NASA OMNIWeb, and the United States Geological Survey were used for the experiment. The 23rd and 24th solar cycles, daily sunspot number, solar wind velocity, proton density, and proton temperature were all included in the dataset. The study also examined sunspots, solar wind, and solar flares, which all reflect solar activity, and the earthquake frequency distribution by magnitude and depth. The findings showed that the long short-term memory network model predicts earthquakes more correctly than the other models applied in the study, and that solar activity is more likely to affect earthquakes of lower magnitude and shallow depth than earthquakes of magnitude 5.5 or larger with intermediate or deep depth.
Keywords: K-Nearest Neighbour, Support Vector Regression, Random Forest Regression, Long Short-Term Memory Network, earthquakes, solar activity, sunspot number, solar wind, solar flares.
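A minimal sketch of the LSTM set-up described above is given below, using tf.keras on sliding 30-day windows of four solar-activity features; the arrays are random placeholders rather than the SILSO/NOAA/OMNIWeb/USGS data, and the window length and network size are assumptions.

import numpy as np
import tensorflow as tf

rng = np.random.default_rng(7)
series = rng.random((4000, 4))                 # placeholder daily solar-activity features
target = rng.random(4000)                      # placeholder daily earthquake count/index

window = 30
X = np.stack([series[i:i + window] for i in range(len(series) - window)])
y = target[window:]

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(window, 4)),
    tf.keras.layers.LSTM(64),
    tf.keras.layers.Dense(1),
])
model.compile(optimizer="adam", loss="mse")
model.fit(X, y, epochs=3, batch_size=64, validation_split=0.2, verbose=0)
print("MSE on the placeholder data:", model.evaluate(X, y, verbose=0))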
827 Improving Activity Recognition Classification of Repetitious Beginner Swimming Using a 2-Step Peak/Valley Segmentation Method with Smoothing and Resampling for Machine Learning
Authors: Larry Powell, Seth Polsley, Drew Casey, Tracy Hammond
Abstract:
Human activity recognition (HAR) systems have shown positive performance when recognizing repetitive activities like walking, running, and sleeping. Water-based activities are a reasonably new area for activity recognition. However, water-based activity recognition has largely focused on supporting the elite and competitive swimming population, which already has amazing coordination and proper form. Beginner swimmers are not perfect, and activity recognition needs to support the individual motions to help beginners. Activity recognition algorithms are traditionally built around short segments of timed sensor data. Using a time window input can cause performance issues in the machine learning model. The window’s size can be too small or large, requiring careful tuning and precise data segmentation. In this work, we present a method that uses a time window as the initial segmentation, then separates the data based on the change in the sensor value. Our system uses a multi-phase segmentation method that pulls all peaks and valleys for each axis of an accelerometer placed on the swimmer’s lower back. This results in high recognition performance using leave-one-subject-out validation on our study with 20 beginner swimmers, with our model optimized from our final dataset resulting in an F-Score of 0.95.
Keywords: Time window, peak/valley segmentation, feature extraction, beginner swimming, activity recognition.
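The two-step segmentation idea can be sketched with scipy.signal.find_peaks: the stream is first cut into coarse time windows and each window is then split at the peaks and valleys of one accelerometer axis. The synthetic signal, sampling rate and peak-distance threshold below are assumptions, and the smoothing/resampling steps are omitted.

import numpy as np
from scipy.signal import find_peaks

fs = 50                                        # assumed sampling rate (Hz)
t = np.arange(0, 60, 1 / fs)
acc_z = np.sin(2 * np.pi * 0.8 * t) + 0.1 * np.random.default_rng(8).normal(size=t.size)   # synthetic axis

window = 5 * fs                                # step 1: coarse 5-second time windows
for start in range(0, len(acc_z) - window, window):
    seg = acc_z[start:start + window]
    peaks, _ = find_peaks(seg, distance=fs // 2)       # step 2: peaks of the axis ...
    valleys, _ = find_peaks(-seg, distance=fs // 2)    # ... and valleys (peaks of the negated axis)
    cuts = np.sort(np.concatenate([peaks, valleys]))
    strokes = [seg[a:b] for a, b in zip(cuts[:-1], cuts[1:])]   # per-motion sub-segments for feature extraction
    print(f"window at {start / fs:.0f}s: {len(strokes)} sub-segments")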
826 Using Thinking Blocks to Encourage the Use of Higher Order Thinking Skills among Students When Solving Problems on Fractions
Authors: Abdul Halim Abdullah, Nur Liyana Zainal Abidin, Mahani Mokhtar
Abstract:
Problem-solving is an activity which can encourage students to use Higher Order Thinking Skills (HOTS). Learning fractions can be challenging for students, since empirical evidence shows that students experience difficulties in solving fraction problems. However, visual methods can help students to overcome these difficulties, since such methods help students to make meaningful visual representations and link abstract concepts in Mathematics. Therefore, the purpose of this study was to investigate whether there were any changes in students’ HOTS at the four highest levels when learning fractions by using Thinking Blocks. 54 students participated in a quasi-experiment using pre-tests and post-tests. Students were divided into two groups. The experimental group (n=32) received a treatment to improve the students’ HOTS and the other group acted as the control group (n=22), which used a traditional method. Data were analysed by using the Mann-Whitney test. The results indicated that during the post-test, students who used Thinking Blocks showed significant improvement in their HOTS level (p=0.000). In addition, the results of the post-test also showed that the students’ performance improved significantly at the four highest levels of HOTS; namely, application (p=0.001), analyse (p=0.000), evaluate (p=0.000), and create (p=0.000). Therefore, it can be concluded that Thinking Blocks can effectively encourage students to use the four highest levels of HOTS, which consequently enables them to solve fraction problems successfully.
Keywords: Thinking blocks, higher order thinking skills, fractions, problem solving.
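For reference, the Mann-Whitney comparison mentioned above takes one line with SciPy; the score lists here are invented placeholders, not the study's data.

from scipy.stats import mannwhitneyu

experimental = [78, 85, 92, 88, 74, 90, 81, 86]   # placeholder post-test scores (study n = 32)
control = [65, 70, 62, 75, 68, 72, 66, 71]        # placeholder post-test scores (study n = 22)

stat, p = mannwhitneyu(experimental, control, alternative="two-sided")
print(f"U = {stat}, p = {p:.4f}")                  # p < 0.05 would indicate a significant difference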
825 Fast Approximate Bayesian Contextual Cold Start Learning (FAB-COST)
Authors: Jack R. McKenzie, Peter A. Appleby, Thomas House, Neil Walton
Abstract:
Cold-start is a notoriously difficult problem which can occur in recommendation systems, and arises when there is insufficient information to draw inferences for users or items. To address this challenge, a contextual bandit algorithm – the Fast Approximate Bayesian Contextual Cold Start Learning algorithm (FAB-COST) – is proposed, which is designed to provide improved accuracy compared to the traditionally used Laplace approximation in the logistic contextual bandit, while controlling both algorithmic complexity and computational cost. To this end, FAB-COST uses a combination of two moment projection variational methods: Expectation Propagation (EP), which performs well at the cold start, but becomes slow as the amount of data increases; and Assumed Density Filtering (ADF), which has slower growth of computational cost with data size but requires more data to obtain an acceptable level of accuracy. By switching from EP to ADF when the dataset becomes large, it is able to exploit their complementary strengths. The empirical justification for FAB-COST is presented, and it is systematically compared to other approaches on simulated data. In a benchmark against the Laplace approximation on real data consisting of over 670,000 impressions from autotrader.co.uk, FAB-COST demonstrates at one point an increase of over 16% in user clicks. On the basis of these results, it is argued that FAB-COST is likely to be an attractive approach to cold-start recommendation systems in a variety of contexts.
Keywords: Cold-start, expectation propagation, multi-armed bandits, Thompson sampling, variational inference.
824 A Static Android Malware Detection Based on Actual Used Permissions Combination and API Calls
Authors: Xiaoqing Wang, Junfeng Wang, Xiaolan Zhu
Abstract:
The Android operating system has been recognized by most application developers because of its good open-source nature and compatibility, which greatly enriches the categories of applications. However, it has become the target of malware attackers due to the lack of strict security supervision mechanisms, which leads to the rapid growth of malware, thus bringing serious safety hazards to users. Therefore, it is critical to detect Android malware effectively. Generally, the permissions declared in the AndroidManifest.xml can reflect the function and behavior of the application to a large extent. Since the current Android system does not place any restrictions on the number of permissions that an application can request, developers tend to apply for more permissions than are actually needed in order to ensure the successful running of the application, which results in the abuse of permissions. However, some traditional detection methods only consider the requested permissions and ignore whether they are actually used, which leads to incorrect identification of some malware. Therefore, a machine learning detection method based on the actually used permissions combination and API calls is put forward in this paper. Meanwhile, several experiments are conducted to evaluate our methodology. The results show that it can detect unknown malware effectively with a higher true positive rate and accuracy while maintaining a low false positive rate. The AdaboostM1 (J48) classification algorithm based on the information gain feature selection algorithm gave the best detection result, achieving an accuracy of 99.8%, a true positive rate of 99.6% and a lowest false positive rate of 0.
Keywords: Android, permissions combination, API calls, machine learning.
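A rough scikit-learn sketch of this pipeline is shown below, with mutual information standing in for information gain and AdaBoost over small decision trees standing in for Weka's AdaBoostM1 with J48; the binary feature matrix and labels are random placeholders, and the estimator argument assumes scikit-learn 1.2 or newer.

import numpy as np
from sklearn.feature_selection import SelectKBest, mutual_info_classif
from sklearn.ensemble import AdaBoostClassifier
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(9)
X = rng.integers(0, 2, size=(2000, 300))      # placeholder used-permission / API-call indicators
y = rng.integers(0, 2, size=2000)             # placeholder label: 0 = benign, 1 = malware

clf = make_pipeline(
    SelectKBest(mutual_info_classif, k=50),   # keep the 50 most informative features
    AdaBoostClassifier(estimator=DecisionTreeClassifier(max_depth=3), n_estimators=100),
)
print("cross-validated accuracy:", cross_val_score(clf, X, y, cv=5).mean())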