Search results for: fly ash based geopolymer
27344 Cost-Effective Hybrid Cloud Framework for HEI’s
Authors: Shah Muhammad Butt, Ahmed Masaud Ansari
Abstract:
The present financial crisis facing Higher Educational Institutes (HEIs) has brought considerable budget cuts, making it difficult to meet ever-growing IT-based research and learning needs; institutions are therefore rapidly planning and promoting cloud-based approaches for their academic and research needs. A cost-effective hybrid cloud framework for HEIs will provide educational services for campus or inter-campus communication. The hybrid cloud framework comprises private and public cloud approaches. This paper proposes a framework based on open source cloud software (OpenNebula for virtualization, Eucalyptus for infrastructure, and Aneka for the programming development environment) combined with CSPs' services, which are delivered to the end user via the Internet from public clouds.
Keywords: educational services, hybrid campus cloud, open source, electrical and systems sciences
Procedia PDF Downloads 458
27343 Robust Variable Selection Based on Schwarz Information Criterion for Linear Regression Models
Authors: Shokrya Saleh A. Alshqaq, Abdullah Ali H. Ahmadini
Abstract:
The Schwarz information criterion (SIC) is a popular tool for selecting the best variables in regression datasets. However, SIC is defined using an unbounded estimator, namely the least squares (LS) estimator, which is highly sensitive to outlying observations, especially bad leverage points. A method for robust variable selection based on SIC for linear regression models is thus needed. This study investigates the robustness properties of SIC by deriving its influence function and proposes a robust SIC based on the MM-estimation scale. The aim of this study is to produce a criterion that can effectively select accurate models in the presence of vertical outliers and high leverage points. The advantages of the proposed robust SIC are demonstrated through a simulation study and an analysis of a real dataset.
Keywords: influence function, robust variable selection, robust regression, Schwarz information criterion
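The contrast between the classical and a robust SIC can be sketched in a few lines. The following is a minimal illustration only: it replaces the LS scale with a normal-consistent median absolute deviation, standing in for the MM-estimation scale the abstract actually proposes.

```python
import math
import statistics

def sic(residuals, k):
    """Classical Schwarz information criterion with the least-squares
    scale: n * log(RSS / n) + k * log(n)."""
    n = len(residuals)
    rss = sum(r * r for r in residuals)
    return n * math.log(rss / n) + k * math.log(n)

def robust_sic(residuals, k, c=1.4826):
    """Robust variant sketch: the LS scale is replaced by the
    normal-consistent median absolute deviation, so a few outlying
    residuals cannot dominate the criterion. (The paper uses an
    MM-estimation scale; MAD merely stands in for it here.)"""
    n = len(residuals)
    scale = c * statistics.median(abs(r) for r in residuals)
    return n * math.log(scale * scale) + k * math.log(n)
```

A single gross residual inflates the classical criterion dramatically while barely moving the robust one, which is exactly why model ranking by the LS-based SIC breaks down under contamination.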
Procedia PDF Downloads 140
27342 Model Solutions for Performance-Based Seismic Analysis of an Anchored Sheet Pile Quay Wall
Authors: C. J. W. Habets, D. J. Peters, J. G. de Gijt, A. V. Metrikine, S. N. Jonkman
Abstract:
Conventional seismic designs of quay walls in ports are mostly based on pseudo-static analysis. A more advanced alternative is the Performance-Based Design (PBD) method, which evaluates permanent deformations and amounts of (repairable) damage under seismic loading. The aim of this study is to investigate the suitability of this method for anchored sheet pile quay walls that were not purposely designed for seismic loads. A research methodology is developed in which pseudo-static, permanent-displacement and finite element analysis are employed, calibrated with an experimental reference case that considers a typical anchored sheet pile wall. A reduction factor that accounts for deformation behaviour is determined for pseudo-static analysis. A model to apply traditional permanent displacement analysis on anchored sheet pile walls is proposed. Dynamic analysis is successfully carried out. From the research it is concluded that PBD evaluation can effectively be used for seismic analysis and design of this type of structure.
Keywords: anchored sheet pile quay wall, simplified dynamic analysis, performance-based design, pseudo-static analysis
Procedia PDF Downloads 379
27341 Early Detection of Breast Cancer in Digital Mammograms Based on Image Processing and Artificial Intelligence
Authors: Sehreen Moorat, Mussarat Lakho
Abstract:
A method of artificial intelligence using digital mammogram data is proposed in this paper for the detection of breast cancer. Many researchers have developed techniques for the early detection of breast cancer; early diagnosis helps to save many lives. Detection of breast cancer through mammography is an effective method, as it detects the cancer before it can be felt and thereby increases the survival rate. In this paper, we propose an image processing technique for enhancing the image to detect the graphical table data and markings. Texture features based on the Gray-Level Co-Occurrence Matrix (GLCM) and intensity-based features are extracted from the selected region. For classification, a neural-network-based supervised classifier that can discriminate between benign and malignant cases is used; 68 digital mammograms were used to train the classifier. The obtained results show that automated detection of breast cancer is beneficial for early diagnosis and increases the survival rates of breast cancer patients. The proposed system will help radiologists in the better interpretation of breast cancer.
Keywords: medical imaging, cancer, processing, neural network
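The GLCM texture features mentioned above are easy to sketch. The toy implementation below (not the paper's pipeline) builds a normalised co-occurrence matrix for the horizontal neighbour offset and derives the Haralick contrast feature from it; a real system would compute several offsets and more features over the selected region.

```python
from collections import Counter

def glcm(image, levels):
    """Gray-Level Co-occurrence Matrix for the horizontal neighbour
    offset (0, 1), normalised so the entries sum to 1."""
    pairs = Counter()
    for row in image:
        for a, b in zip(row, row[1:]):   # count horizontally adjacent pairs
            pairs[(a, b)] += 1
    total = sum(pairs.values())
    return [[pairs[(i, j)] / total for j in range(levels)]
            for i in range(levels)]

def contrast(p):
    """Haralick contrast feature: sum over (i - j)^2 * P(i, j).
    Zero for a perfectly uniform region, large for noisy texture."""
    return sum((i - j) ** 2 * pij
               for i, row in enumerate(p)
               for j, pij in enumerate(row))
```

Features such as this one, stacked into a vector per region, are what the supervised neural classifier would consume.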
Procedia PDF Downloads 259
27340 Statistical Inferences for the GQARCH-Itô-Jumps Model Based on the Realized Range Volatility
Authors: Fu Jinyu, Lin Jinguan
Abstract:
This paper introduces a novel approach that unifies two types of models: the continuous-time jump-diffusion model used for high-frequency data, and the discrete-time GQARCH model employed for low-frequency financial data, by embedding the discrete GQARCH structure with jumps in the instantaneous volatility process. This model is named the “GQARCH-Itô-Jumps model.” We adopt realized range-based threshold estimation for high-frequency financial data rather than realized return-based volatility estimators, which entail the loss of intra-day information on the price movement. Meanwhile, a quasi-likelihood function for the low-frequency GQARCH structure with jumps is developed for parametric estimation. Asymptotic theories are established mainly for the proposed estimators in the case of finite-activity jumps. Moreover, simulation studies are implemented to check the finite-sample performance of the proposed methodology. Specifically, it is demonstrated how our proposed approaches can be practically applied to financial data.
Keywords: Itô process, GQARCH, leverage effects, threshold, realized range-based volatility estimator, quasi-maximum likelihood estimate
Procedia PDF Downloads 158
27339 Self-Regulated Learning: A Required Skill for Web 2.0 Internet-Based Learning
Authors: Pieter Conradie, M. Marina Moller
Abstract:
Web 2.0 Internet-based technologies have intruded into all aspects of human life. Presently, this phenomenon is especially evident in the educational context, with disruptive Web 2.0 technology infusions dramatically changing educational practice. The most prominent of these Web 2.0 intrusions are Massive Open Online Courses (Coursera, EdX), video and photo sharing sites (YouTube, Flickr, Instagram), and Web 2.0 online tools utilized to create Personal Learning Environments (PLEs): Symbaloo (aggregator), Delicious (social bookmarking), PBWorks (collaboration), Google+ (social networks), WordPress (blogs), and Wikispaces (wikis). These Web 2.0 technologies have supported the realignment from a teacher-based pedagogy (didactic presentation) to a learner-based pedagogy (problem-based learning, project-based learning, blended learning), allowing greater learner autonomy. No longer is the educator the source of knowledge; instead, the educator has become the facilitator and mediator for the learner, involved in developing learner competencies to support life-long (continuous) learning in the 21st century. In this study, the self-regulated learning skills of thirty first-year university learners were explored using the Online Self-regulated Learning Questionnaire. Implementing an action research method, an intervention was effected to improve the self-regulation skill set of the participants. Statistically significant results were obtained, with increased self-regulated learning proficiency positively impacting learner performance. Goal setting, time management, environment structuring, help seeking, task (learning) strategies, and self-evaluation skills were confirmed as determinants of improved learner success.
Keywords: andragogy, online self-regulated learning questionnaire, self-regulated learning, web 2.0
Procedia PDF Downloads 417
27338 Evaluation of Model-Based Code Generation for Embedded Systems – Mature Approach for Development in Evolution
Authors: Nikolay P. Brayanov, Anna V. Stoynova
Abstract:
The model-based development approach is gaining support and acceptance. Its higher abstraction level simplifies system description, allowing domain experts to do their best without particular knowledge of programming. The different levels of simulation support rapid prototyping, verifying, and validating the product even before it exists physically. Nowadays the model-based approach is beneficial for modelling complex embedded systems as well as for generating code for many different hardware platforms. Moreover, it can be applied in safety-relevant industries like automotive, which brings extra automation to the expensive device certification process, especially in software qualification. Using it, some companies report cost savings and quality improvements, but others claim no major changes or even cost increases. This publication assesses the level of maturity and autonomy of the model-based approach for code generation. It is based on a real-life automotive seat heater (ASH) module, developed using tools from The MathWorks, Inc. The model, created with Simulink, Stateflow, and MATLAB, is used for automatic generation of C code with Embedded Coder. To prove the maturity of the process, the Code Generation Advisor is used for automatic configuration. All additional configuration parameters are set to auto, when applicable, leaving the generation process to function autonomously. As a result of the investigation, the publication compares the quality of the automatically generated embedded code with that of manually developed code. The measurements show that, in general, the code generated by the automatic approach is no worse than the manual one. A deeper analysis of the technical parameters enumerates the disadvantages, some of which are identified as topics for our future work.
Keywords: embedded code generation, embedded C code quality, embedded systems, model-based development
Procedia PDF Downloads 244
27337 Comparative Analysis of Dissimilarity Detection between Binary Images Based on Equivalency and Non-Equivalency of Image Inversion
Authors: Adnan A. Y. Mustafa
Abstract:
Image matching is a fundamental problem that arises frequently in many aspects of robot and computer vision. It can become a time-consuming process when matching images to a database consisting of hundreds of images, especially if the images are big. One approach to reducing the time complexity of the matching process is to reduce the search space in a pre-matching stage, by simply removing dissimilar images quickly. The Probabilistic Matching Model for Binary Images (PMMBI) showed that dissimilarity detection between binary images can be accomplished quickly by random pixel mapping and is size invariant. The model is based on the gamma binary similarity distance that recognizes an image and its inverse as containing the same scene and hence considers them to be the same image. However, in many applications, an image and its inverse are not treated as being the same but rather dissimilar. In this paper, we present a comparative analysis of dissimilarity detection between PMMBI based on the gamma binary similarity distance and a modified PMMBI model based on a similarity distance that does distinguish between an image and its inverse as being dissimilar.
Keywords: binary image, dissimilarity detection, probabilistic matching model for binary images, image mapping
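The two distance notions being compared can be sketched directly. Below is a simplified illustration of random pixel mapping, not the published PMMBI model: the inversion-equivalent variant folds the disagreement fraction at 1/2 (so an image and its inverse score 0), while the modified variant keeps the raw fraction (so they score as maximally dissimilar). Function names are mine, not the paper's.

```python
import random

def mismatch_fraction(a, b, samples, rng):
    """Estimate pixel disagreement between two equal-size binary
    images by sampling random positions instead of a full scan."""
    h, w = len(a), len(a[0])
    diff = 0
    for _ in range(samples):
        i, j = rng.randrange(h), rng.randrange(w)
        if a[i][j] != b[i][j]:
            diff += 1
    return diff / samples

def inversion_equivalent_distance(a, b, samples=200, seed=0):
    """Gamma-style distance: folding at 1/2 makes an image and its
    inverse score 0, i.e. they are treated as the same scene."""
    d = mismatch_fraction(a, b, samples, random.Random(seed))
    return min(d, 1.0 - d)

def inversion_sensitive_distance(a, b, samples=200, seed=0):
    """Modified distance: the raw disagreement is kept, so an image
    and its inverse come out maximally dissimilar."""
    return mismatch_fraction(a, b, samples, random.Random(seed))
```

Because only a fixed number of random pixels is touched, both variants stay fast regardless of image size, which is the size-invariance property claimed for PMMBI.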
Procedia PDF Downloads 153
27336 Assessment-Assisted and Relationship-Based Financial Advising: Using an Empirical Assessment to Understand Personal Investor Risk Tolerance in Professional Advising Relationships
Authors: Jerry Szatko, Edan L. Jorgensen, Stacia Jorgensen
Abstract:
A crucial component to the success of any financial advising relationship is for the financial professional to understand the perceptions, preferences and thought-processes carried by the financial clients they serve. Armed with this information, financial professionals are more quickly able to understand how they can tailor their approach to best match the individual preferences and needs of each personal investor. Our research explores the use of a quantitative assessment tool in the financial services industry to assist in the identification of the personal investor’s consumer behaviors, especially in terms of financial risk tolerance, as it relates to their financial decision making. Through this process, the Unitifi Consumer Insight Tool (UCIT) was created and refined to capture and categorize personal investor financial behavioral categories and the financial personality tendencies of individuals prior to the initiation of a financial advisement relationship. This paper discusses the use of this tool to place individuals in one of four behavior-based financial risk tolerance categories. Our discoveries and research were aided through administration of a web-based survey to a group of over 1,000 individuals. Our findings indicate that it is possible to use a quantitative assessment tool to assist in predicting the behavioral tendencies of personal consumers when faced with consumer financial risk and decisions.
Keywords: behavior-based advising, financial relationship building, risk capacity based on behavior, risk tolerance, systematic way to assist in financial relationship building
Procedia PDF Downloads 167
27335 Arabic Lexicon Learning to Analyze Sentiment in Microblogs
Authors: Mahmoud B. Rokaya
Abstract:
The study of opinion mining and sentiment analysis includes the analysis of opinions, sentiments, evaluations, attitudes, and emotions. The rapid growth of social media, social networks, reviews, forum discussions, microblogs, and Twitter leads to parallel growth in the field of sentiment analysis, which tries to develop effective tools for capturing the trends of people. There are two approaches in the field: lexicon-based and corpus-based methods. A lexicon-based method uses a sentiment lexicon that includes sentiment words and phrases with assigned numeric scores. These scores reveal whether sentiment phrases are positive or negative, their intensity, and/or their emotional orientations. Creating lexicons manually is hard, which brings the need for adaptive automated methods for generating a lexicon. The proposed method generates dynamic lexicons based on the corpus and then classifies text using these lexicons. In the proposed method, different approaches are combined to generate lexicons from text, and tweets are classified into 5 classes instead of positive or negative classes. The sentiment classification problem is written as an optimization problem in which finding optimal sentiment lexicons is the goal. The solution was produced using mathematical programming approaches to find the best lexicon to classify texts, and a genetic algorithm was written to find the optimal lexicon. Then, extraction of a meta-level feature was done based on the optimal lexicon. The experiments were conducted on several datasets. Results, in terms of accuracy, recall, and F-measure, outperformed the state-of-the-art methods proposed in the literature on some of the datasets. A better understanding of the Arabic language and culture of Arab Twitter users, and of the sentiment orientation of words in different contexts, can be achieved based on the sentiment lexicons proposed by the algorithm.
Keywords: social media, Twitter sentiment, sentiment analysis, lexicon, genetic algorithm, evolutionary computation
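The idea of evolving a lexicon against labelled text can be sketched compactly. The toy below is not the paper's genetic algorithm: it uses a simplified (1+1) evolutionary loop, a binary (not 5-class) labelling, and a four-word English corpus that is entirely my own assumption, but it shows how lexicon fitting becomes an optimization problem.

```python
import random

TRAIN = [                                   # toy labelled corpus (assumed data)
    ("good great", 1), ("great good good", 1),
    ("bad awful", -1), ("awful bad bad", -1),
]
WORDS = ["good", "great", "bad", "awful"]

def classify(lexicon, text):
    """Score a text by summing word scores; the sign gives the class."""
    score = sum(lexicon.get(w, 0) for w in text.split())
    return 1 if score >= 0 else -1

def fitness(lexicon):
    """Number of training texts classified correctly."""
    return sum(classify(lexicon, t) == y for t, y in TRAIN)

def evolve(generations=300, seed=0):
    """Mutate one word score at a time and keep the mutant whenever
    accuracy does not drop - a (1+1) evolutionary loop standing in
    for a full population-based genetic algorithm."""
    rng = random.Random(seed)
    best = {w: 0 for w in WORDS}
    for _ in range(generations):
        cand = dict(best)
        cand[rng.choice(WORDS)] = rng.choice([-2, -1, 1, 2])
        if fitness(cand) >= fitness(best):
            best = cand
    return best
```

A real GA would add a population, crossover between lexicons, and a 5-class objective, but the fitness-driven search over word scores is the same mechanism.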
Procedia PDF Downloads 189
27334 Demand for Index Based Micro-Insurance (IBMI) in Ethiopia
Authors: Ashenafi Sileshi Etefa, Bezawit Worku Yenealem
Abstract:
Micro-insurance is a relatively new concept that is just being introduced in Ethiopia. For an agrarian economy dominated by smallholder farming and vulnerable to natural disasters, mainly drought, the need for Index-Based Micro-Insurance (IBMI) is crucial. Since IBMI mitigates moral hazard, adverse selection, and access issues for poor clients, it is preferable to traditional insurance products. IBMI is being piloted in drought-prone areas of Ethiopia with the aim of learning from and expanding the service across the country. This article analyses the demand for IBMI and the barriers to demand, finds that demand has so far been constrained by lack of awareness, trust issues, costliness, and the level of basis risk, and recommends reducing the basis risk and increasing the role of government and farmer cooperatives.
Keywords: agriculture, index based micro-insurance (IBMI), drought, micro-finance institution (MFI)
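The mechanics behind an index product, and the origin of basis risk, can be shown with a stylised payout rule. The trigger/exit structure and all numbers below are illustrative assumptions, not taken from any Ethiopian pilot.

```python
def ibmi_payout(rainfall_mm, trigger_mm, exit_mm, sum_insured):
    """Stylised rainfall-index payout: zero at or above the trigger,
    the full sum insured at or below the exit level, and linear in
    between. The payout depends only on the measured index, never on
    the farmer's actual loss - the gap between the two is the basis
    risk the article discusses."""
    if rainfall_mm >= trigger_mm:
        return 0.0
    if rainfall_mm <= exit_mm:
        return float(sum_insured)
    frac = (trigger_mm - rainfall_mm) / (trigger_mm - exit_mm)
    return frac * sum_insured
```

Because payouts key off an objectively measured index, claims need no field loss assessment, which is what removes the moral hazard and adverse selection problems of indemnity insurance.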
Procedia PDF Downloads 290
27333 Scalable Learning of Tree-Based Models on Sparsely Representable Data
Authors: Fares Hedayatit, Arnauld Joly, Panagiotis Papadimitriou
Abstract:
Many machine learning tasks, such as text annotation, usually require training over very big datasets, e.g., millions of web documents, that can be represented in a sparse input space. State-of-the-art tree-based ensemble algorithms cannot scale to such datasets, since they include operations whose running time is a function of the input space size rather than of the number of non-zero input elements. In this paper, we propose an efficient splitting algorithm to leverage input sparsity within decision tree methods. Our algorithm improves training time over sparse datasets by more than two orders of magnitude, and it has been incorporated in the current version of scikit-learn, the most popular open source Python machine learning library.
Keywords: big data, sparsely representable data, tree-based models, scalable learning
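The core trick of sparsity-aware splitting can be illustrated on a single feature. The sketch below (my own simplification, not the scikit-learn implementation) scans candidate thresholds using only the non-zero entries; the implicit-zero samples enter the computation through two aggregates, so the loop cost depends on the number of non-zeros rather than on n.

```python
def best_sparse_split(nonzero, labels, n, total_pos):
    """Best misclassification split for one sparsely stored feature.
    `nonzero` maps sample index -> value (> 0); the other
    n - len(nonzero) samples are implicitly 0 and enter the scan only
    through aggregates, never individually."""
    nz = sorted((v, labels[i]) for i, v in nonzero.items())
    zero_n = n - len(nz)
    zero_pos = total_pos - sum(y for _, y in nz)

    def err(ln, lp, rn, rp):
        # misclassified count when each side predicts its majority class
        return min(lp, ln - lp) + min(rp, rn - rp)

    # initial split: the whole zero block on the left, all non-zeros right
    left_n, left_pos = zero_n, zero_pos
    right_n, right_pos = len(nz), total_pos - zero_pos
    best = (err(left_n, left_pos, right_n, right_pos), 0.0)
    for k, (v, y) in enumerate(nz):
        left_n += 1; left_pos += y
        right_n -= 1; right_pos -= y
        if k + 1 < len(nz) and nz[k + 1][0] == v:
            continue                  # split only between distinct values
        best = min(best, (err(left_n, left_pos, right_n, right_pos), v))
    return best                       # (misclassified count, threshold)
```

A dense scan would evaluate every sample at every threshold; here the zero block is folded into two counters once, which is the essence of making split finding a function of the non-zero count.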
Procedia PDF Downloads 263
27332 Combining the Dynamic Conditional Correlation and Range-GARCH Models to Improve Covariance Forecasts
Authors: Piotr Fiszeder, Marcin Fałdziński, Peter Molnár
Abstract:
The dynamic conditional correlation (DCC) model of Engle (2002) is one of the most popular multivariate volatility models. However, this model is based solely on closing prices. It has been documented in the literature that the high and low prices of the day can be used for efficient volatility estimation. We therefore suggest a model that incorporates high and low prices into the dynamic conditional correlation framework. Empirical evaluation of this model is conducted on three datasets: currencies, stocks, and commodity exchange-traded funds. The utilisation of realized variances and covariances as proxies for true variances and covariances allows us to reach a strong conclusion that our model outperforms not only the standard dynamic conditional correlation model but also a competing range-based dynamic conditional correlation model.
Keywords: volatility, DCC model, high and low prices, range-based models, covariance forecasting
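Why high and low prices help is easiest to see through the classic Parkinson estimator, the textbook representative of the range-based family that such models build on (shown here as background; the paper's Range-GARCH-DCC machinery is considerably more involved).

```python
import math

def parkinson_variance(highs, lows):
    """Parkinson range-based daily variance estimate: the squared
    log high-low range, scaled by 1 / (4 ln 2), averaged over days.
    Using the intraday range extracts information that close-to-close
    returns discard."""
    k = 4.0 * math.log(2.0)
    n = len(highs)
    return sum(math.log(h / l) ** 2 for h, l in zip(highs, lows)) / (k * n)
```

A close-to-close estimator sees only one price per day; the range estimator sees the day's extremes, which is why range-based variance inputs can sharpen covariance forecasts in a DCC-type model.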
Procedia PDF Downloads 183
27331 Investigation and Optimization of DNA Isolation Efficiency Using Ferrite-Based Magnetic Nanoparticles
Authors: Tímea Gerzsenyi, Ágnes M. Ilosvai, László Vanyorek, Emma Szőri-Dorogházi
Abstract:
DNA isolation is a crucial step in many molecular biology applications for diagnostic and research purposes. However, traditional extraction requires toxic reagents, and commercially available kits are expensive, which has led to the recently widespread method of magnetic nanoparticle (MNP)-based DNA isolation. Different ferrite-containing MNPs were examined and compared for their plasmid DNA isolation efficiency. Among the tested MNPs, one has never before been used for the extraction of plasmid molecules, marking a distinct application. The pDNA isolation process was optimized for each type of nanoparticle, and the best protocol was selected based on different criteria: DNA quantity, quality, and integrity. With the best-performing magnetic nanoparticle, which excelled in all aspects, further tests were performed to recover genomic DNA from bacterial cells, and a protocol was developed.
Keywords: DNA isolation, nanobiotechnology, magnetic nanoparticles, protocol optimization, pDNA, gDNA
Procedia PDF Downloads 12
27330 A Review on the Use of Herbal Alternatives to Antibiotics in Poultry Diets
Authors: Sasan Chalaki, Seyed Ali Mirgholange, Touba Nadri, Saman Chalaki
Abstract:
In the current world, proper poultry nutrition has garnered special attention as one of the fundamental factors for enhancing poultry health and performance. Concerns about the excessive use of antibiotics in the poultry industry and their role in antibiotic resistance have turned this issue into a global challenge for public health and the environment. At the same time, poultry farming plays a vital role as a primary source of meat and eggs in human nutrition, so improving poultry health and performance is crucial. One effective approach to enhancing poultry nutrition is utilizing the antibiotic properties of plant-based ingredients. Plants contain various antibacterial compounds, such as flavonoids, tannins, and essential oils, which are recognized as active agents in combating bacteria. Plant-based antibiotics are compounds extracted from plants with antibacterial properties and are acknowledged as effective substitutes for chemical antibiotics in poultry diets. Their advantages include reducing the risk of resistance to chemical antibiotics, increasing poultry growth performance, and lowering the risk of disease transmission; their use also plays a significant role in reducing the consumption of synthetic antibiotics and preventing antibiotic-resistance-related issues.
Keywords: poultry, antibiotics, essential oils, plant-based
Procedia PDF Downloads 78
27329 Bridge Health Monitoring: A Review
Authors: Mohammad Bakhshandeh
Abstract:
Structural Health Monitoring (SHM) is a crucial and necessary practice that plays a vital role in ensuring the safety and integrity of critical structures, and in particular, bridges. The continuous monitoring of bridges for signs of damage or degradation through Bridge Health Monitoring (BHM) enables early detection of potential problems, allowing for prompt corrective action to be taken before significant damage occurs. Although all monitoring techniques aim to provide accurate and decisive information regarding the remaining useful life, safety, integrity, and serviceability of bridges, understanding the development and propagation of damage is vital for maintaining uninterrupted bridge operation. Over the years, extensive research has been conducted on BHM methods, and experts in the field have increasingly adopted new methodologies. In this article, we provide a comprehensive exploration of the various BHM approaches, including sensor-based, non-destructive testing (NDT), model-based, and artificial intelligence (AI)-based methods. We also discuss the challenges associated with BHM, including sensor placement and data acquisition, data analysis and interpretation, cost and complexity, and environmental effects, through an extensive review of relevant literature and research studies. Additionally, we examine potential solutions to these challenges and propose future research ideas to address critical gaps in BHM.
Keywords: structural health monitoring (SHM), bridge health monitoring (BHM), sensor-based methods, machine-learning algorithms, and model-based techniques, sensor placement, data acquisition, data analysis
Procedia PDF Downloads 90
27328 Case-Based Reasoning: A Hybrid Classification Model Improved with an Expert's Knowledge for High-Dimensional Problems
Authors: Bruno Trstenjak, Dzenana Donko
Abstract:
Data mining and the classification of objects constitute a process of data analysis using various machine learning techniques, applied today in many fields of research. This paper presents a concept for a hybrid classification model improved with expert knowledge. The hybrid model integrates several machine learning techniques (Information Gain, K-means, and Case-Based Reasoning) and the expert's knowledge into one algorithm. The knowledge of experts is used to determine the importance of features. The paper presents the model algorithm and the results of a case study in which the emphasis was put on achieving the maximum classification accuracy without reducing the number of features.
Keywords: case based reasoning, classification, expert's knowledge, hybrid model
Procedia PDF Downloads 367
27327 Empirical Mode Decomposition Based Denoising by Customized Thresholding
Authors: Wahiba Mohguen, Raïs El’hadi Bekka
Abstract:
This paper presents a denoising method called EMD-Custom that is based on Empirical Mode Decomposition (EMD) and a modified Customized Thresholding Function (Custom). EMD is applied to adaptively decompose a noisy signal into intrinsic mode functions (IMFs). Then, all the noisy IMFs are thresholded by applying the presented thresholding function to suppress noise and improve the signal-to-noise ratio (SNR). The method was tested on simulated data and a real ECG signal, and the results were compared to EMD-based signal denoising methods using soft and hard thresholding. The results showed the superior performance of the proposed EMD-Custom denoising over the traditional approaches. The performances were evaluated in terms of SNR (in dB) and Mean Square Error (MSE).
Keywords: customized thresholding, ECG signal, EMD, hard thresholding, soft thresholding
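The three thresholding rules being compared are small functions. The sketch below shows standard hard and soft thresholding plus an illustrative interpolation between them; the paper's actual Custom function is a specific design of its own, and the EMD decomposition step is assumed to have already produced the IMFs.

```python
def hard_threshold(x, t):
    """Keep the sample unchanged if it exceeds the threshold, else zero it."""
    return x if abs(x) > t else 0.0

def soft_threshold(x, t):
    """Zero small samples and shrink surviving ones toward zero by t."""
    if abs(x) <= t:
        return 0.0
    return (abs(x) - t) * (1.0 if x > 0 else -1.0)

def custom_threshold(x, t, alpha=0.5):
    """Illustrative compromise: alpha=0 behaves like hard and alpha=1
    like soft thresholding. This interpolation only conveys the idea
    of a customized rule, not the paper's exact function."""
    if abs(x) <= t:
        return 0.0
    return x - alpha * t * (1.0 if x > 0 else -1.0)

def denoise(imfs, thresholds, rule=soft_threshold):
    """Threshold every sample of every IMF, then sum the IMFs back
    into a denoised signal."""
    kept = [[rule(x, t) for x in imf] for imf, t in zip(imfs, thresholds)]
    return [sum(col) for col in zip(*kept)]
```

Hard thresholding preserves large-sample amplitudes but leaves discontinuities at the threshold; soft thresholding is continuous but biases large samples, which motivates compromise rules of this kind.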
Procedia PDF Downloads 302
27326 Rank-Based Chain-Mode Ensemble for Binary Classification
Authors: Chongya Song, Kang Yen, Alexander Pons, Jin Liu
Abstract:
In the field of machine learning, ensembles are commonly employed to improve performance over multiple base classifiers. However, true predictions are often canceled out by false ones during consensus due to a phenomenon called the “curse of correlation”, i.e., strong interference among the predictions produced by the base classifiers. In addition, existing practices still cannot effectively mitigate the problem of imbalanced classification. Based on the analysis of our experimental results, we conclude that the two problems are caused by inherent deficiencies in the consensus approach. Therefore, we create an enhanced ensemble algorithm that adopts a designed rank-based chain-mode consensus to overcome the two problems. To evaluate the proposed ensemble algorithm, we employ a well-known benchmark dataset, NSL-KDD (the improved version of the KDDCup99 dataset produced by the University of New Brunswick), to make comparisons between the proposed algorithm and 8 common ensemble algorithms. In particular, each compared ensemble classifier uses the same 22 base classifiers, so that the differences in terms of improvements in accuracy and reliability over the base classifiers can be truly revealed. As a result, the proposed rank-based chain-mode consensus proves to be a more effective ensemble solution than the traditional consensus approach, outperforming the 8 ensemble algorithms by 20% on almost all compared metrics, which include accuracy, precision, recall, F1-score, and area under the receiver operating characteristic curve.
Keywords: consensus, curse of correlation, imbalance classification, rank-based chain-mode ensemble
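The rank-based half of the idea can be sketched in isolation. In the toy consensus below (my own simplification; the paper's chain-mode refinement is not reproduced), each classifier's positive-class scores are replaced by normalised ranks before averaging, so one over-confident base classifier cannot swamp the vote with extreme score magnitudes.

```python
def rank_consensus(score_lists, threshold=0.5):
    """score_lists[c][i] is classifier c's positive-class score for
    sample i. Each classifier's scores are converted to normalised
    ranks in (0, 1) before averaging, so only the ordering each
    classifier induces matters, not its raw score scale."""
    n = len(score_lists[0])
    avg = [0.0] * n
    for scores in score_lists:
        order = sorted(range(n), key=lambda i: scores[i])
        for r, i in enumerate(order):
            avg[i] += (r + 0.5) / n / len(score_lists)
    return [1 if a >= threshold else 0 for a in avg]
```

Rank averaging is one standard way to damp inter-classifier interference; the chain-mode stage the paper adds on top then orders how classifiers contribute to the consensus.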
Procedia PDF Downloads 138
27325 Design and Optimization of Open Loop Supply Chain Distribution Network Using Hybrid K-Means Cluster Based Heuristic Algorithm
Authors: P. Suresh, K. Gunasekaran, R. Thanigaivelan
Abstract:
Radio frequency identification (RFID) technology has been attracting considerable attention with the expectation of improved supply chain visibility for consumer goods, apparel, and pharmaceutical manufacturers, as well as retailers and government procurement agencies. It is also expected to improve the consumer shopping experience by making it more likely that the products customers want to purchase are available. Recent announcements from some key retailers have brought interest in RFID to the forefront. A modified K-Means cluster-based heuristic approach, a hybrid Genetic Algorithm (GA)-Simulated Annealing (SA) approach, a hybrid K-Means cluster-based heuristic-GA, and a hybrid K-Means cluster-based heuristic-GA-SA for the open loop supply chain network problem are proposed. The study incorporated a uniform crossover operator and a combined crossover operator in the GAs for solving the open loop supply chain distribution network problem. The algorithms were tested on 50 randomly generated data sets and compared with each other. The results of the numerical experiments show that the hybrid K-Means cluster-based heuristic-GA-SA shows superior performance to the other methods for solving the open loop supply chain distribution network problem.
Keywords: RFID, supply chain distribution network, open loop supply chain, genetic algorithm, simulated annealing
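The SA component of these hybrids follows a standard pattern. The generic loop below is a sketch of plain simulated annealing only; the paper's hybrids additionally seed it with K-Means clustering and combine it with a GA, none of which is reproduced here.

```python
import math
import random

def anneal(cost, neighbour, start, t0=1.0, cooling=0.95, steps=500, seed=0):
    """Plain simulated-annealing loop: always accept improvements,
    accept worse neighbours with probability exp(-delta / T), and
    cool the temperature geometrically so the search settles down."""
    rng = random.Random(seed)
    cur, cur_cost = start, cost(start)
    best, best_cost = cur, cur_cost
    t = t0
    for _ in range(steps):
        nxt = neighbour(cur, rng)
        delta = cost(nxt) - cur_cost
        if delta <= 0 or rng.random() < math.exp(-delta / t):
            cur, cur_cost = nxt, cur_cost + delta
            if cur_cost < best_cost:
                best, best_cost = cur, cur_cost
        t *= cooling
    return best, best_cost
```

For the distribution network problem, the state would be a facility-to-customer assignment and the neighbour move a reassignment swap; the occasional uphill acceptance is what lets SA escape the local optima a pure greedy heuristic gets stuck in.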
Procedia PDF Downloads 165
27324 A Comparison of Neural Network and DOE-Regression Analysis for Predicting Resource Consumption of Manufacturing Processes
Authors: Frank Kuebler, Rolf Steinhilper
Abstract:
Artificial neural networks (ANNs) as well as Design of Experiments (DOE) based regression analysis (RA) are mainly used for the modeling of complex systems. Both methodologies are commonly applied in process and quality control of manufacturing processes. Because resource efficiency has become a critical concern for manufacturing companies, these models need to be extended to predict the resource consumption of manufacturing processes. This paper describes an approach that uses neural networks as well as DOE-based regression analysis for predicting the resource consumption of manufacturing processes, and gives a comparison of the achievable results based on an industrial case study of a turning process.
Keywords: artificial neural network, design of experiments, regression analysis, resource efficiency, manufacturing process
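At its simplest, the DOE-regression side amounts to fitting a regression metamodel to measured process responses. The closed-form one-factor sketch below is illustrative only; the factor (e.g. cutting speed) and response (e.g. energy consumption) pairing is an assumption, and a real DOE model would include several factors and interaction terms.

```python
def fit_line(xs, ys):
    """Ordinary least squares for y = a + b*x in closed form - the
    simplest DOE-style regression metamodel of a process response
    (e.g. energy consumption vs. one process parameter)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    b = sxy / sxx
    return my - b * mx, b            # intercept a, slope b
```

An ANN would fit the same data with a nonlinear function approximator instead; the paper's comparison is essentially about which metamodel predicts the measured resource consumption better.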
Procedia PDF Downloads 524
27323 Deep-Learning Based Approach to Facial Emotion Recognition through Convolutional Neural Network
Authors: Nouha Khediri, Mohammed Ben Ammar, Monji Kherallah
Abstract:
Recently, facial emotion recognition (FER) has become increasingly important for understanding the state of the human mind, yet accurately classifying emotion from the face remains a challenging task. In this paper, we present a facial emotion recognition approach named CV-FER, benefiting from deep learning, especially CNNs and VGG16. First, the data is pre-processed with data cleaning and data rotation. Then, we augment the data and proceed to our FER model, which contains five convolution layers and five pooling layers. Finally, a softmax classifier is used in the output layer to recognize emotions. On this basis, the paper also reviews works on facial emotion recognition based on deep learning. Experiments show that our model outperforms the other methods using the same FER2013 database and yields a recognition rate of 92%. We also put forward some suggestions for future work.
Keywords: CNN, deep-learning, facial emotion recognition, machine learning
Procedia PDF Downloads 95
27322 A Survey on Recognizing Daily Living Activities in Multi-User Smart Home Environments
Authors: Kulsoom S. Bughio, Naeem K. Janjua, Gordana Dermody, Leslie F. Sikos, Shamsul Islam
Abstract:
Advancements in information and communication technologies (ICT) and wireless sensor networks have played a pivotal role in the design and development of real-time healthcare solutions, mainly targeting the elderly living in health-assistive smart homes. Such smart homes are equipped with sensor technologies to detect and record activities of daily living (ADL). This survey reviews and evaluates existing approaches and techniques based on real-time sensor-based modeling and reasoning in single-user and multi-user environments. It classifies the approaches into three main categories: learning-based, knowledge-based, and hybrid, and evaluates how they handle temporal relations, granularity, and uncertainty. The survey also highlights open challenges across various disciplines (including computer and information sciences and health sciences) to encourage interdisciplinary research for the detection and recognition of ADLs, and discusses future directions.
Keywords: daily living activities, smart homes, single-user environment, multi-user environment
Procedia PDF Downloads 141
27321 Design and Evaluation of an Online Case-Based Library for Technology Integration in Teacher Education
Authors: Mustafa Tevfik Hebebci, Ismail Sahin, Sirin Kucuk, Ismail Celik, Ahmet Oguz Akturk
Abstract:
ADDIE is an instructional design model with five core elements: analyze, design, develop, implement, and evaluate. The ADDIE approach provides a systematic process for analyzing instructional needs, designing and developing instructional programs and materials, implementing a program, and evaluating the effectiveness of the instruction. The case-based study is an instructional design model that is a variant of project-oriented learning. Instructional designers can use collected and analyzed stories in two primary ways: to perform task analysis, and as a learning support during instruction. In addition, teachers use technology to develop students' thinking, enrich the learning environment, and support lasting learning. The purpose of this paper is to introduce an interactive online case-study library website developed within the scope of a national project. The design goal of the website is to provide an interactive, enhanced, case-based, online educational resource for educators. The ADDIE instructional design model was used in the development of the website for the interactive case-based library. This web-based library contains navigation menus such as "Homepage", "Registration", "Branches", "Aim of the Research", "About TPACK", "National Project", and "Contact Us". The library is built on a web-based platform, which is important for the manageability, accessibility, and updateability of its data. Users are able to sort the displayed case studies by title, date, rating, view count, etc., and they are encouraged to rate and comment on the case studies. The website was evaluated through a usability test and expert opinion. This website is a tool for integrating technology in education.
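The sorting feature described above is a straightforward keyed sort over the case-study records. A minimal sketch (the records and field names here are hypothetical, not the library's actual schema):

```python
# Sort case studies the way the library's listing does: by rating or
# by view count, highest first.
cases = [
    {"title": "Using Tablets in Math", "rating": 4.2, "views": 310},
    {"title": "Smartboards in Science", "rating": 4.8, "views": 120},
    {"title": "Blogs for Writing", "rating": 3.9, "views": 540},
]

by_rating = sorted(cases, key=lambda c: c["rating"], reverse=True)
by_views = sorted(cases, key=lambda c: c["views"], reverse=True)
```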
It is believed that this website will be beneficial for pre-service and in-service teachers in terms of their professional development.
Keywords: design, ADDIE, case-based library, technology integration
Procedia PDF Downloads 479
27320 Evolution of Nettlespurge Oil Mud for Drilling Mud System: A Comparative Study of Diesel Oil and Nettlespurge Oil as Oil-Based Drilling Mud
Authors: Harsh Agarwal, Pratikkumar Patel, Maharshi Pathak
Abstract:
Recently, low crude oil prices and increasingly strict environmental regulations have limited the use of diesel-based muds, which are relatively costly and toxic; as a result, disposal of cuttings into the ecosystem is a major issue faced by the drilling industry. To overcome these issues, an attempt has been made to develop an oil-in-water emulsion mud system using nettlespurge oil. Nettlespurge oil is easily available and costs around ₹30/litre, about half the price of diesel in India. Oil-based mud (OBM) was formulated with nettlespurge oil extracted from nettlespurge seeds using the Soxhlet extraction method. The properties of the formulated nettlespurge oil mud were compared with those of diesel oil mud: rheological properties (yield point and gel strength), mud density, and filtration loss properties (fluid loss and filter cake). The mud density measurements showed that the nettlespurge OBM was slightly denser than the diesel OBM, with values of 9.175 lb/gal and 8.5 lb/gal, respectively, at a barite content of 70 g; the nettlespurge mud also had a higher lubricating property. The filtration loss test showed a fluid loss volume of 11 ml for the nettlespurge mud, compared to 15 ml for the diesel oil mud. It also showed that the nettlespurge oil mud formed a 2.2 mm filter cake that was thin and squashy, while the diesel oil mud formed a 2.7 mm cake that was tenacious, rubbery, and resilient. The lower fluid loss and thinner filter cake indicate low formation damage, and the emulsion stability effect was analysed in this experiment.
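The rheological comparison rests on standard Bingham-plastic quantities: from two rotational-viscometer dial readings, plastic viscosity and yield point follow the usual field formulas. A short sketch (the dial readings below are hypothetical, not the paper's data):

```python
def plastic_viscosity(theta600, theta300):
    # PV (cP) = dial reading at 600 rpm minus dial reading at 300 rpm.
    return theta600 - theta300

def yield_point(theta600, theta300):
    # YP (lb/100 ft^2) = 300 rpm reading minus plastic viscosity.
    return theta300 - plastic_viscosity(theta600, theta300)

# Hypothetical viscometer readings for a mud sample.
pv = plastic_viscosity(46, 30)  # 16 cP
yp = yield_point(46, 30)        # 14 lb/100 ft^2
```

A lower PV at comparable YP generally indicates better hole-cleaning efficiency per unit pumping cost, which is the kind of comparison the study makes between the two muds.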
The nettlespurge oil-in-water mud system had a lower coefficient of friction than the diesel oil-based mud system, and all the rheological properties showed better results relative to the diesel-based oil mud. Therefore, considering all the factors above together with the experimental data, we conclude that nettlespurge oil-based mud is both economically and ecologically more feasible than conventional diesel-based oil mud in the drilling industry.
Keywords: economic feasibility, ecological feasibility, emulsion stability, nettlespurge oil, rheological properties, Soxhlet extraction method
Procedia PDF Downloads 203
27319 Satellite-Based Drought Monitoring in Korea: Methodologies and Merits
Authors: Joo-Heon Lee, Seo-Yeon Park, Chanyang Sur, Ho-Won Jang
Abstract:
Satellite-based remote sensing techniques have been widely used in drought and environmental monitoring to overcome the weaknesses of in-situ monitoring. Remote sensing offers many advantages for drought watch in terms of data accessibility, monitoring resolution, and the types of available hydro-meteorological data, including environmental variables. This study focused on the applicability of drought monitoring based on satellite imagery, applied to historical drought events that had a huge meteorological, agricultural, and hydrological impact. Several satellite-based drought indices were evaluated to assess their capability over the complex topography of the Korean peninsula: the Standardized Precipitation Index (SPI) using Tropical Rainfall Measuring Mission (TRMM) and Global Precipitation Measurement (GPM) data; the Vegetation Health Index (VHI) using MODIS-based Land Surface Temperature (LST) and the Normalized Difference Vegetation Index (NDVI); and the Scaled Drought Condition Index (SDCI). While the VHI accurately captured moderate drought conditions in agriculturally drought-damaged areas, the SDCI performed relatively well in hydrologically drought-damaged areas. In addition, this study found correlations among the various drought indices and assessed their applicability using the Receiver Operating Characteristic (ROC) method, which will expand our understanding of the relationships between hydro-meteorological variables and drought events at the global scale. The results of this research are expected to assist decision makers in taking timely and appropriate action in order to save millions of lives in drought-damaged areas.
Keywords: drought monitoring, moderate resolution imaging spectroradiometer (MODIS), remote sensing, receiver operating characteristic (ROC)
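The VHI mentioned above is conventionally defined as a weighted blend of the Vegetation Condition Index (computed from NDVI) and the Temperature Condition Index (computed from LST). A minimal sketch of that standard definition (the pixel values and the equal weighting are illustrative assumptions, not the study's calibration):

```python
def vci(ndvi, ndvi_min, ndvi_max):
    # Vegetation Condition Index: NDVI scaled to 0-100 over its
    # historical range for the pixel.
    return 100.0 * (ndvi - ndvi_min) / (ndvi_max - ndvi_min)

def tci(lst, lst_min, lst_max):
    # Temperature Condition Index: hot pixels indicate stress,
    # so the scale is inverted.
    return 100.0 * (lst_max - lst) / (lst_max - lst_min)

def vhi(ndvi, ndvi_min, ndvi_max, lst, lst_min, lst_max, alpha=0.5):
    # Weighted blend of the two condition indices (alpha = 0.5 is common).
    return alpha * vci(ndvi, ndvi_min, ndvi_max) \
        + (1 - alpha) * tci(lst, lst_min, lst_max)

# Hypothetical pixel: NDVI 0.45 within [0.2, 0.8], LST 305 K within [290, 320].
v = vhi(0.45, 0.2, 0.8, 305.0, 290.0, 320.0)
```

Low VHI values (commonly below about 40) are interpreted as vegetation stress and potential drought.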
Procedia PDF Downloads 329
27318 Design of Bayesian MDS Sampling Plan Based on the Process Capability Index
Authors: Davood Shishebori, Mohammad Saber Fallah Nezhad, Sina Seifi
Abstract:
In this paper, a variable multiple dependent state (MDS) sampling plan is developed based on the process capability index using a Bayesian approach. The optimal parameters of the developed sampling plan are presented with respect to constraints on consumer and producer risk. Two comparison studies were carried out. In the first, the double sampling model, the sampling plan for resubmitted lots, and the repetitive group sampling (RGS) plan are elaborated, and the average sample numbers of the developed MDS plan and these classical methods are compared. In the second, the developed MDS plan based on the Bayesian approach is compared with the plan based on the exact probability distribution.
Keywords: MDS sampling plan, RGS plan, sampling plan for resubmitted lots, process capability index (PCI), average sample number (ASN), Bayesian approach
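The plan is parameterized by a process capability index. For a normally distributed quality characteristic with two-sided specification limits, the familiar index Cpk measures the distance from the process mean to the nearest specification limit in units of three standard deviations. A short sketch (the specification limits and process parameters are hypothetical):

```python
def cpk(mu, sigma, lsl, usl):
    # Process capability index: distance from the mean to the nearest
    # specification limit, divided by 3 sigma.
    return min(usl - mu, mu - lsl) / (3.0 * sigma)

# Hypothetical process: mean 10.0, sigma 0.5, spec limits [8.0, 13.0].
c = cpk(mu=10.0, sigma=0.5, lsl=8.0, usl=13.0)  # min(3.0, 2.0) / 1.5
```

A Cpk of 1.33 or higher is a common acceptance benchmark; sampling plans like the one in the paper accept or reject lots based on an estimate of such an index.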
Procedia PDF Downloads 301
27317 Real-Time Network Anomaly Detection Systems Based on Machine-Learning Algorithms
Authors: Zahra Ramezanpanah, Joachim Carvallo, Aurelien Rodriguez
Abstract:
This paper aims to detect anomalies in streaming data using machine learning algorithms. We designed two separate pipelines and evaluated the effectiveness of each. The first pipeline, based on supervised machine learning methods, consists of two phases. In the first phase, we trained several supervised models on the UNSW-NB15 dataset, measured the efficiency of each using different performance metrics, and selected the best model for the second phase. At the beginning of the second phase, we sniffed a local area network using Argus Server, simulated several types of attacks, and sent the sniffed data to the running algorithm at short intervals. The algorithm uses the trained model to display the result for each received packet in real time. The second pipeline presented in this paper is based on unsupervised algorithms, in which a Temporal Graph Network (TGN) is used to monitor a local network. The TGN is trained to predict the probability of future states of the network based on its past behavior. Our contribution in this section is an indicator that identifies anomalies from these predicted probabilities.
Keywords: temporal graph network, anomaly detection, cyber security, IDS
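The per-record, real-time flow the pipeline describes can be illustrated with a much simpler stand-in for the trained models: a streaming z-score rule over a running mean and variance (Welford's algorithm). This is our illustrative substitute, not the paper's method, and the input values are invented:

```python
import math

class StreamingZScoreDetector:
    """Flag a value as anomalous when it deviates from the running mean
    by more than `k` sample standard deviations (Welford's algorithm)."""

    def __init__(self, k=3.0):
        self.k, self.n, self.mean, self.m2 = k, 0, 0.0, 0.0

    def update(self, x):
        # Score the incoming value against statistics of past values only.
        anomalous = False
        if self.n >= 2:
            std = math.sqrt(self.m2 / (self.n - 1))
            anomalous = std > 0 and abs(x - self.mean) > self.k * std
        # Then fold the value into the running statistics.
        self.n += 1
        delta = x - self.mean
        self.mean += delta / self.n
        self.m2 += delta * (x - self.mean)
        return anomalous

det = StreamingZScoreDetector(k=3.0)
flags = [det.update(x) for x in [10, 11, 9, 10, 12, 10, 50]]
```

Each record is scored as it arrives, mirroring how the paper's system classifies each sniffed packet against a model trained on earlier data.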
Procedia PDF Downloads 103
27316 Collision Avoidance Based on Model Predictive Control for Nonlinear Octocopter Model
Authors: Doğan Yıldız, Aydan Müşerref Erkmen
Abstract:
Octocopter controllers are mostly based on PID control, but for complex maneuvers such as collision avoidance, PID controllers have limited performance. When an octocopter must avoid an obstacle, it has to execute an agile maneuver instantly, and such maneuvers are severely affected by the octocopter's nonlinear characteristics. Given these limitations, the situation is highly challenging for a PID controller. In the proposed study, we attempt to minimize these challenges by using a model predictive controller (MPC) for collision avoidance with a nonlinear octocopter model. The aim is to show that MPC-based collision avoidance can deal with fast-varying conditions when an obstacle is detected and can diminish the nonlinear effects of the octocopter under varying disturbances.
Keywords: model predictive control, nonlinear octocopter model, collision avoidance, obstacle detection
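The core MPC idea, predicting candidate trajectories over a horizon and picking the lowest-cost one that stays clear of the obstacle, can be sketched with a shooting-style controller on a linear point-mass stand-in. All dynamics, costs, and values here are illustrative assumptions, not the paper's nonlinear octocopter formulation:

```python
import itertools
import math

def simulate(pos, vel, accel, horizon, dt=0.1):
    # Roll out a constant acceleration over the horizon (point-mass model).
    traj = []
    px, py = pos
    vx, vy = vel
    ax, ay = accel
    for _ in range(horizon):
        vx += ax * dt; vy += ay * dt
        px += vx * dt; py += vy * dt
        traj.append((px, py))
    return traj

def mpc_choose(pos, vel, goal, obstacle, radius, horizon=20):
    # Shooting MPC: enumerate candidate accelerations, penalize any
    # trajectory that enters the obstacle's safety radius, and pick
    # the one ending closest to the goal.
    best, best_cost = None, float("inf")
    for ax, ay in itertools.product([-1.0, 0.0, 1.0], repeat=2):
        traj = simulate(pos, vel, (ax, ay), horizon)
        cost = math.dist(traj[-1], goal)
        if any(math.dist(p, obstacle) < radius for p in traj):
            cost += 1e6  # hard collision penalty
        if cost < best_cost:
            best, best_cost = (ax, ay), cost
    return best, best_cost

# Heading straight at an obstacle at (1, 0); the controller must deviate.
action, cost = mpc_choose((0.0, 0.0), (1.0, 0.0), goal=(2.0, 0.0),
                          obstacle=(1.0, 0.0), radius=0.3)
```

In a real MPC this one-shot choice is re-solved at every control step over a richer input set, which is what lets it react to fast-varying obstacle detections.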
Procedia PDF Downloads 191
27315 A Framework Based on Dempster-Shafer Theory of Evidence Algorithm for the Analysis of the TV-Viewers’ Behaviors
Authors: Hamdi Amroun, Yacine Benziani, Mehdi Ammi
Abstract:
In this paper, we propose an approach for detecting the behavior of viewers of a TV program in a non-controlled environment. The proposed experiment is based on the use of three types of connected objects (a smartphone, a smartwatch, and a connected remote control). 23 participants were observed while watching their TV programs during three phases: before, during, and after watching. Their behaviors were detected using an approach based on Dempster-Shafer Theory (DST) in two phases: first, dynamically approximating the mass functions using an approach based on the correlation coefficient, and second, fusing the approximate mass functions. To approximate the mass functions, two approaches were tested. The first was to divide the data space of each feature into cells, each with a specific probability distribution over the behaviors; these distributions were computed statistically (estimated by the empirical distribution). The second was to predict the TV-viewing behaviors using classifier algorithms and add uncertainty to the prediction based on the uncertainty of the model. Results showed that mixing the fusion rule with the computation of the initial approximate mass functions using a classifier led to overall success rates of 96%, 95%, and 96% for the first, second, and third TV-viewing phases, respectively. The results were also compared to those found in the literature. This study aims to anticipate certain actions in order to maintain TV viewers' attention on the proposed TV programs using everyday connected objects, taking into account the various uncertainties that can be generated.
Keywords: IoT, TV-viewing behaviors identification, automatic classification, unconstrained environment
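The fusion step relies on Dempster's rule of combination: masses from two sources are multiplied over intersecting focal elements and renormalized by the non-conflicting mass. A minimal sketch over hypothetical TV-viewing behaviors (the behavior labels and mass values are invented for illustration):

```python
from itertools import product

def combine(m1, m2):
    """Dempster's rule of combination for two mass functions whose
    focal elements are frozensets of hypotheses."""
    combined, conflict = {}, 0.0
    for (a, wa), (b, wb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + wa * wb
        else:
            conflict += wa * wb  # mass assigned to the empty set
    # Normalize by the non-conflicting mass (1 - K).
    return {k: v / (1.0 - conflict) for k, v in combined.items()}

# Two sensor sources assigning mass to viewing behaviors (hypothetical).
A = frozenset({"watching"})
B = frozenset({"distracted"})
AB = A | B  # ignorance: either behavior
m1 = {A: 0.6, AB: 0.4}
m2 = {A: 0.5, B: 0.2, AB: 0.3}
fused = combine(m1, m2)
```

Fusing the two sources concentrates most of the mass on "watching" while keeping a small residual on "distracted" and on the ignorance set, which is how the framework carries uncertainty through to the final decision.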
Procedia PDF Downloads 229