Search results for: iterative algorithms
601 Implementation of a Multimodal Biometrics Recognition System with Combined Palm Print and Iris Features
Authors: Rabab M. Ramadan, Elaraby A. Elgallad
Abstract:
With extensive application, the performance of unimodal biometric systems has to face a diversity of problems such as signal and background noise, distortion, and environment differences. Therefore, multimodal biometric systems have been proposed to solve the above-stated problems. This paper introduces a bimodal biometric recognition system based on the extracted features of the human palm print and iris. Palm print biometrics is a fairly new technology that is used to identify people by their palm features. The iris, together with the face and fingerprints, is a strong candidate for inclusion in multimodal recognition systems. In this research, we introduce an algorithm that combines the palm- and iris-extracted features using a texture-based descriptor, the Scale Invariant Feature Transform (SIFT). Since the feature sets are non-homogeneous, as features of different biometric modalities are used, these features are concatenated to form a single feature vector. Particle swarm optimization (PSO) is used as a feature selection technique to reduce the dimensionality of the feature vector. The proposed algorithm will be applied to the Indian Institute of Technology Delhi (IITD) database and its performance will be compared with various iris recognition algorithms found in the literature.
Keywords: iris recognition, particle swarm optimization, feature extraction, feature selection, palm print, the Scale Invariant Feature Transform (SIFT)
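As a concrete illustration of the PSO-based feature selection step described in this abstract, here is a minimal sketch, assuming a sigmoid-transfer binary PSO and a kNN matcher as the fitness function; the synthetic data stand in for concatenated SIFT descriptors and are not the authors' actual pipeline.

```python
# Hedged sketch: binary PSO selecting features from a concatenated vector,
# scored by kNN cross-validation accuracy. All data and settings are
# illustrative placeholders.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(0)
X, y = make_classification(n_samples=200, n_features=60, n_informative=15,
                           random_state=0)

def fitness(mask):
    if mask.sum() == 0:
        return 0.0
    clf = KNeighborsClassifier(n_neighbors=3)
    return cross_val_score(clf, X[:, mask.astype(bool)], y, cv=3).mean()

n_particles, n_dims, n_iter = 20, X.shape[1], 30
pos = (rng.random((n_particles, n_dims)) < 0.5).astype(float)
vel = rng.normal(0, 0.1, (n_particles, n_dims))
pbest, pbest_fit = pos.copy(), np.array([fitness(p) for p in pos])
gbest = pbest[pbest_fit.argmax()].copy()

for _ in range(n_iter):
    r1, r2 = rng.random((2, n_particles, n_dims))
    vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
    prob = 1.0 / (1.0 + np.exp(-vel))               # sigmoid transfer function
    pos = (rng.random((n_particles, n_dims)) < prob).astype(float)
    fit = np.array([fitness(p) for p in pos])
    improved = fit > pbest_fit
    pbest[improved], pbest_fit[improved] = pos[improved], fit[improved]
    gbest = pbest[pbest_fit.argmax()].copy()

print(f"selected {int(gbest.sum())} of {n_dims} features, "
      f"CV accuracy {pbest_fit.max():.3f}")
```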
Procedia PDF Downloads 235
600 An Optimization Algorithm for Reducing the Liquid Oscillation in the Moving Containers
Authors: Reza Babajanivalashedi, Stefania Lo Feudo, Jean-Luc Dion
Abstract:
Liquid sloshing is a crucial problem for the dynamics of moving containers in the packaging industry. Sloshing issues have so far mainly been modeled within the framework of fluid dynamics or by using equivalent mechanical models with different kinds of movements and shapes of containers. Nevertheless, these approaches cannot determine the shape of the liquid's free surface when the moving container has an irregular shape, so experimental measurements may be required. If there is too much slosh in the moving tank, liquid can splash onto the packages, so the free surface oscillation must be controlled and reduced to eliminate splashing. The purpose of this research is to propose an optimization algorithm for finding an optimum command law to reduce surface elevation. In the first step, the free surface of the liquid is simulated based on separation-of-variables and weak-formulation models. Then, genetic and gradient algorithms are developed for finding the optimum command law. The optimum command law is compared with existing command laws, and the results show that there is a significant difference in surface oscillation between the optimum and existing command laws. This algorithm is applicable to different varieties of bottles when a camera is used to detect the liquid elevation, and it can produce new command laws for different kinds of tanks to reduce the surface oscillation and remove the splashing phenomenon.
Keywords: sloshing phenomenon, separation variables, weak formulation, optimization algorithm, command law
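To make the optimization loop concrete, the sketch below evolves a piecewise-constant acceleration profile (a stand-in command law) with a simple genetic algorithm, scoring candidates against a single damped-oscillator surrogate of the free surface. The mode frequency, damping, segment count, and GA settings are all assumed placeholder values, not the paper's free-surface model.

```python
# Hedged sketch: a genetic algorithm searching over piecewise-constant
# acceleration profiles (a toy command law) to minimize a sloshing
# surrogate: the integrated response of a single damped oscillator mode.
import numpy as np

rng = np.random.default_rng(1)
omega, zeta = 2 * np.pi * 1.5, 0.02   # assumed first sloshing mode (rad/s, -)
n_seg, t_end, dv = 8, 2.0, 0.5        # segments, horizon (s), velocity change (m/s)
dt = 5e-3

def slosh_cost(accels):
    # rescale so every candidate reaches the same final container velocity
    a = accels * dv / (accels.sum() * (t_end / n_seg))
    x = v = cost = 0.0
    for t in np.arange(0.0, t_end, dt):
        seg = min(int(t / t_end * n_seg), n_seg - 1)
        acc = -a[seg] - 2 * zeta * omega * v - omega**2 * x   # forced oscillator
        v += acc * dt
        x += v * dt
        cost += x * x * dt            # integrated squared surface elevation
    return cost

pop = rng.uniform(0.1, 1.0, (30, n_seg))
for _ in range(40):
    costs = np.array([slosh_cost(ind) for ind in pop])
    parents = pop[np.argsort(costs)[:10]]                     # truncation selection
    children = []
    while len(children) < len(pop) - len(parents):
        i, j = rng.integers(0, len(parents), 2)
        cut = rng.integers(1, n_seg)
        child = np.concatenate([parents[i][:cut], parents[j][cut:]])  # crossover
        child += rng.normal(0, 0.05, n_seg)                   # Gaussian mutation
        children.append(np.clip(child, 0.05, 1.0))
    pop = np.vstack([parents] + children)

best = min(pop, key=slosh_cost)
print("best sloshing cost:", slosh_cost(best))
```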
Procedia PDF Downloads 154
599 Classification of Potential Biomarkers in Breast Cancer Using Artificial Intelligence Algorithms and Anthropometric Datasets
Authors: Aref Aasi, Sahar Ebrahimi Bajgani, Erfan Aasi
Abstract:
Breast cancer (BC) continues to be the most frequent cancer in females and causes the highest number of cancer-related deaths in women worldwide. Inspired by recent advances in studying the relationship between different patient attributes and features and the disease, in this paper, we investigate different classification methods for better diagnosis of BC in the early stages. In this regard, datasets from the University Hospital Centre of Coimbra were chosen, and different machine learning (ML)-based and neural network (NN) classifiers have been studied. For this purpose, we selected favorable features among the nine provided attributes from the clinical dataset by using a random forest algorithm. This dataset consists of both healthy controls and BC patients, and it was noted that glucose, BMI, resistin, and age have the most importance, in that order. Moreover, we analyzed these features with various ML-based classifier methods, including Decision Tree (DT), K-Nearest Neighbors (KNN), eXtreme Gradient Boosting (XGBoost), Logistic Regression (LR), Naive Bayes (NB), and Support Vector Machine (SVM), along with the NN-based Multi-Layer Perceptron (MLP) classifier. The results revealed that among the different techniques, the SVM and MLP classifiers have the highest accuracy, at 96% and 92%, respectively. These results indicate that the adopted procedure can be used effectively for the classification of cancer cells, and they encourage further experimental investigations with more collected data for other types of cancer.
Keywords: breast cancer, diagnosis, machine learning, biomarker classification, neural network
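A minimal sketch of the two-stage procedure follows: random forest importance ranking to pick the top attributes, then a comparison of classifiers by cross-validation. Synthetic data of the same shape (9 attributes, binary label) stand in for the Coimbra dataset, and model settings are illustrative.

```python
# Hedged sketch: feature ranking with a random forest, then classifier
# comparison on the selected attributes. Data are a synthetic stand-in.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

X, y = make_classification(n_samples=116, n_features=9, n_informative=4,
                           random_state=0)

rf = RandomForestClassifier(n_estimators=500, random_state=0).fit(X, y)
top = np.argsort(rf.feature_importances_)[::-1][:4]   # e.g. glucose, BMI, resistin, age
X_sel = X[:, top]

models = {
    "SVM": make_pipeline(StandardScaler(), SVC()),
    "MLP": make_pipeline(StandardScaler(), MLPClassifier(max_iter=2000, random_state=0)),
    "KNN": make_pipeline(StandardScaler(), KNeighborsClassifier()),
}
for name, model in models.items():
    acc = cross_val_score(model, X_sel, y, cv=5).mean()
    print(f"{name}: {acc:.3f}")
```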
Procedia PDF Downloads 139
598 Applying and Connecting the Microgrid of Artificial Intelligence in the Form of a Spiral Model to Optimize Renewable Energy Sources
Authors: PR
Abstract:
Renewable energy is a sustainable substitute for fossil fuels, which are being depleted and contribute to global warming as well as greenhouse gas emissions. Renewable energy innovations, including solar, wind, and geothermal, have grown significantly in recent years and play a critical role in meeting energy demands. Artificial Intelligence (AI) could further enhance the benefits of renewable energy systems. The combination of renewable technologies and AI could facilitate the development of smart grids that can better manage energy distribution and storage. AI thus has the potential to optimize the efficiency and reliability of renewable energy systems, reduce costs, and improve their overall performance. The conventional methods of using smart microgrids are to connect these microgrids in series, in parallel, or in a combination of series and parallel. Each of these methods has its advantages and disadvantages. In this study, the proposal of connecting microgrids in a spiral manner is investigated. One of the important reasons for choosing this type of structure is the two-way reinforcement and exchange of each inner layer with the outer and upstream layer. With this model, we have the ability to increase energy from a small amount to a significant amount based on exponential functions. The geometry used to connect the smart microgrids is inspired by nature. This study provides an overview of the applications of AI algorithms and models, as well as their advantages and challenges, in renewable energy systems.
Keywords: artificial intelligence, renewable energy sources, spiral model, optimize
Procedia PDF Downloads 14
597 Automatic Early Breast Cancer Segmentation Enhancement by Image Analysis and Hough Transform
Authors: David Jurado, Carlos Ávila
Abstract:
Detection of early signs of breast cancer development is crucial to quickly diagnose the disease and to define adequate treatment to increase the survival probability of the patient. Computer Aided Detection systems (CADs), along with modern data techniques such as Machine Learning (ML) and Neural Networks (NN), have shown an overall improvement in digital mammography cancer diagnosis, reducing the false positive and false negative rates, and have become important tools for the diagnostic evaluations performed by specialized radiologists. However, ML and NN-based algorithms rely on datasets that might bring issues to the segmentation tasks. In the present work, an automatic segmentation and detection algorithm is described. This algorithm uses image processing techniques along with the Hough transform to automatically identify microcalcifications that are highly correlated with breast cancer development in the early stages. Along with image processing, automatic segmentation of high-contrast objects is done using edge extraction and the circle Hough transform. This provides the geometrical features needed for an automatic mask design, which extracts statistical features of the regions of interest. The results shown in this study prove the potential of this tool for further diagnostics and classification of mammographic images due to its low sensitivity to noisy images and low-contrast mammograms.
Keywords: breast cancer, segmentation, X-ray imaging, Hough transform, image analysis
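The segmentation core described above can be sketched with OpenCV's circle Hough transform. The file name, blur kernel, and Hough parameters below are assumptions for illustration, not the authors' settings.

```python
# Hedged sketch: noise suppression, then the circle Hough transform to flag
# small bright circular structures (a stand-in for microcalcifications).
# OpenCV is assumed; 'mammogram.png' and all parameters are placeholders.
import cv2
import numpy as np

img = cv2.imread("mammogram.png", cv2.IMREAD_GRAYSCALE)
assert img is not None, "provide a mammogram image at this path"
blurred = cv2.GaussianBlur(img, (5, 5), 0)      # reduce sensitivity to noise

circles = cv2.HoughCircles(
    blurred, cv2.HOUGH_GRADIENT, dp=1, minDist=10,
    param1=100,    # upper Canny threshold used by the internal edge extraction
    param2=15,     # accumulator threshold: lower values yield more candidates
    minRadius=1, maxRadius=10,
)

if circles is not None:
    for x, y, r in np.round(circles[0]).astype(int):
        # statistical features of the region of interest around each candidate
        roi = img[max(y - r, 0):y + r + 1, max(x - r, 0):x + r + 1]
        print(f"candidate at ({x},{y}) r={r}: mean={roi.mean():.1f} std={roi.std():.1f}")
        cv2.circle(img, (x, y), r, 255, 1)
cv2.imwrite("candidates.png", img)
```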
Procedia PDF Downloads 84
596 Comparative Analysis of Classification Methods in Determining Non-Active Student Characteristics in Indonesia Open University
Authors: Dewi Juliah Ratnaningsih, Imas Sukaesih Sitanggang
Abstract:
Classification is a data mining technique that aims to discover, from training data, a model that assigns records to the appropriate category or class. Data mining classification methods can be applied in education, for example, to determine the classification of non-active students at Indonesia Open University. This paper presents a comparison of three classification methods: Naïve Bayes, Bagging, and C4.5. The criteria used to evaluate the performance of the three classification methods are stratified cross-validation, the confusion matrix, the value of the area under the ROC curve (AUC), recall, precision, and F-measure. The data used for this paper are from non-active Indonesia Open University students in the registration periods 2004.1 to 2012.2. The target analysis requires that non-active students be divided into three groups: C1, C2, and C3. The analyzed data cover 4,173 students. The results of the study show: (1) the Bagging method gave a higher classification accuracy than Naïve Bayes and C4.5; (2) the Bagging classification accuracy rate is 82.99%, while those of Naïve Bayes and C4.5 are 80.04% and 82.74%, respectively; (3) the tree produced by the Bagging classification method has a large number of nodes, which makes decision making quite difficult; (4) the characteristics of non-active Indonesia Open University students are therefore classified using the C4.5 algorithm; (5) based on the C4.5 algorithm, there are 5 interesting rules which can describe the characteristics of non-active Indonesia Open University students.
Keywords: comparative analysis, data mining, classification, Bagging, Naïve Bayes, C4.5, non-active students, Indonesia Open University
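A hedged sketch of the comparison protocol follows. scikit-learn's entropy-criterion decision tree stands in for C4.5 (the exact algorithm is not in sklearn), and the data are synthetic with the same number of records and three classes.

```python
# Hedged sketch: Naive Bayes, a C4.5-style tree, and Bagging compared with
# stratified cross-validation on accuracy and AUC. Data are synthetic.
from sklearn.datasets import make_classification
from sklearn.ensemble import BaggingClassifier
from sklearn.model_selection import StratifiedKFold, cross_val_score
from sklearn.naive_bayes import GaussianNB
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=4173, n_features=12, n_classes=3,
                           n_informative=6, random_state=0)

models = {
    "Naive Bayes": GaussianNB(),
    "C4.5-style tree": DecisionTreeClassifier(criterion="entropy", random_state=0),
    "Bagging": BaggingClassifier(DecisionTreeClassifier(criterion="entropy"),
                                 n_estimators=50, random_state=0),
}
cv = StratifiedKFold(n_splits=10, shuffle=True, random_state=0)
for name, model in models.items():
    acc = cross_val_score(model, X, y, cv=cv).mean()
    auc = cross_val_score(model, X, y, cv=cv, scoring="roc_auc_ovr").mean()
    print(f"{name}: accuracy={acc:.3f} AUC={auc:.3f}")
```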
Procedia PDF Downloads 316
595 Crop Classification Using Unmanned Aerial Vehicle Images
Authors: Iqra Yaseen
Abstract:
One of the well-known areas of computer science and engineering, image processing in the context of computer vision has been essential to automation. In remote sensing, medical science, and many other fields, it has made it easier to uncover previously undiscovered facts. Grading of diverse items is now possible because of neural network algorithms, categorization, and digital image processing. Its use in the classification of agricultural products, particularly in the grading of seeds or grains and their cultivars, is widely recognized. A grading and sorting system enables the preservation of time, consistency, and uniformity. Global population growth has led to an increase in demand for food staples, biofuel, and other agricultural products. To meet this demand, available resources must be used and managed more effectively. Image processing is rapidly growing in the field of agriculture. Many applications have been developed using this approach for crop identification and classification, land and disease detection, and for measuring other crop parameters. Vegetation localization is the basis for performing these tasks: vegetation helps to identify the area where the crop is present. The productivity of the agriculture industry can be increased via image processing based on Unmanned Aerial Vehicle (UAV) and satellite photography. In this paper, we apply machine learning techniques such as Convolutional Neural Networks (CNN), deep learning, image processing, classification, and You Only Look Once (YOLO) to a UAV imaging dataset to divide the crops into distinct groups and choose the best way to use them.
Keywords: image processing, UAV, YOLO, CNN, deep learning, classification
Procedia PDF Downloads 108
594 Bridging the Educational Gap: A Curriculum Framework for Mass Timber Construction Education and Comparative Analysis of Physical vs. Virtual Prototypes in Construction Management
Authors: Farnaz Jafari
Abstract:
The surge in mass timber construction represents a pivotal moment in sustainable building practices, yet the lack of comprehensive education in construction management poses a challenge in harnessing this innovation effectively. This research endeavors to bridge this gap by developing a curriculum framework integrating mass timber construction into undergraduate and industry certificate programs. To optimize learning outcomes, the study explores the impact of two prototype formats, Virtual Reality (VR) simulations and physical mock-ups, on students' understanding and skill development. The curriculum framework aims to equip future construction managers with a holistic understanding of mass timber, covering its unique properties, construction methods, building codes, and sustainable advantages. The study adopts a mixed-methods approach, commencing with a systematic literature review and leveraging surveys and interviews with educators and industry professionals to identify existing educational gaps. The iterative development process involves incorporating stakeholder feedback into the curriculum. The evaluation of prototype impact employs pre- and post-tests administered to participants engaged in pilot programs. Through qualitative content analysis and quantitative statistical methods, the study seeks to compare the effectiveness of VR simulations and physical mock-ups in conveying knowledge and skills related to mass timber construction. The anticipated findings will illuminate the strengths and weaknesses of each approach, providing insights for future curriculum development. The curriculum's expected contribution to sustainable construction education lies in its emphasis on practical application, bridging the gap between theoretical knowledge and hands-on skills. The research also seeks to establish a standard for mass timber construction education, contributing to the field through a unique comparative analysis of VR simulations and physical mock-ups. The study's significance extends to the development of best practices and evidence-based recommendations for integrating technology and hands-on experiences in construction education. By addressing current educational gaps and offering a comparative analysis, this research aims to enrich the construction management education experience and pave the way for broader adoption of sustainable practices in the industry. The envisioned curriculum framework is designed for versatile integration, catering to undergraduate programs and industry training modules, thereby enhancing the educational landscape for aspiring construction professionals. Ultimately, this study underscores the importance of proactive educational strategies in preparing industry professionals for the evolving demands of the construction landscape, facilitating a seamless transition towards sustainable building practices.
Keywords: curriculum framework, mass timber construction, physical vs. virtual prototypes, sustainable building practices
Procedia PDF Downloads 73
593 Antibacterial Evaluation, in Silico ADME and QSAR Studies of Some Benzimidazole Derivatives
Authors: Strahinja Kovačević, Lidija Jevrić, Miloš Kuzmanović, Sanja Podunavac-Kuzmanović
Abstract:
In this paper, various derivatives of benzimidazole have been evaluated against the Gram-negative bacterium Escherichia coli. For all investigated compounds, the minimum inhibitory concentration (MIC) was determined. Quantitative structure-activity relationship (QSAR) modeling attempts to find consistent relationships between the variations in the values of molecular properties and the biological activity for a series of compounds, so that these rules can be used to evaluate new chemical entities. The correlation between MIC and some absorption, distribution, metabolism, and excretion (ADME) parameters was investigated, and mathematical models for predicting the antibacterial activity of this class of compounds were developed. The quality of the multiple linear regression (MLR) models was validated by the leave-one-out (LOO) technique, as well as by the calculation of the statistical parameters for the developed models, and the results are discussed on the basis of the statistical data. The results of this study indicate that ADME parameters have a significant effect on the antibacterial activity of this class of compounds. Principal component analysis (PCA) and agglomerative hierarchical clustering (HCA) algorithms confirmed that the investigated molecules can be classified into groups on the basis of the ADME parameters: Madin-Darby canine kidney cell permeability (MDCK), plasma protein binding (PPB%), human intestinal absorption (HIA%), and human colon carcinoma cell permeability (Caco-2).
Keywords: benzimidazoles, QSAR, ADME, in silico
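The MLR-plus-LOO validation workflow might look as follows. The descriptor values and regression targets are randomly generated placeholders, and the cross-validated Q² is computed from the LOO prediction residual sum of squares (PRESS).

```python
# Hedged sketch: multiple linear regression relating assumed ADME
# descriptors (MDCK, PPB%, HIA%, Caco-2) to activity, validated by
# leave-one-out. Values are synthetic placeholders.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import LeaveOneOut, cross_val_predict

rng = np.random.default_rng(0)
n = 25                                          # assumed number of benzimidazoles
X = rng.normal(size=(n, 4))                     # columns: MDCK, PPB%, HIA%, Caco-2
y = X @ np.array([0.5, -0.3, 0.4, 0.2]) + rng.normal(0, 0.2, n)  # synthetic activity

model = LinearRegression().fit(X, y)
r2 = model.score(X, y)                          # conventional R^2

y_loo = cross_val_predict(model, X, y, cv=LeaveOneOut())
press = np.sum((y - y_loo) ** 2)
q2 = 1 - press / np.sum((y - y.mean()) ** 2)    # cross-validated Q^2 (LOO)

print(f"R2={r2:.3f}  Q2(LOO)={q2:.3f}")
print("coefficients:", dict(zip(["MDCK", "PPB%", "HIA%", "Caco-2"],
                                model.coef_.round(3))))
```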
Procedia PDF Downloads 377
592 A Variational Reformulation for the Thermomechanically Coupled Behavior of Shape Memory Alloys
Authors: Elisa Boatti, Ulisse Stefanelli, Alessandro Reali, Ferdinando Auricchio
Abstract:
Thanks to their unusual properties, shape memory alloys (SMAs) are good candidates for advanced applications in a wide range of engineering fields, such as automotive, robotics, civil, biomedical, and aerospace. In the last decades, the ever-growing interest in such materials has boosted several research studies aimed at modeling their complex nonlinear behavior in an effective and robust way. Since the constitutive response of SMAs is strongly thermomechanically coupled, the non-isothermal evolution of the material must be taken into consideration. The present study considers an existing three-dimensional phenomenological model for SMAs, able to reproduce the main SMA properties while maintaining a simple user-friendly structure, and proposes a variational reformulation of the full non-isothermal version of the model. While the considered model has been thoroughly assessed in an isothermal setting, the proposed formulation allows the full non-isothermal problem to be taken into account. In particular, the reformulation is inspired by the GENERIC (General Equations for Non-Equilibrium Reversible-Irreversible Coupling) formalism and is based on a generalized gradient flow of the total entropy, related to thermal and mechanical variables. This phrasing of the model is new and allows for a discussion of the model from both a theoretical and a numerical point of view. Moreover, it directly implies the dissipativity of the flow. A semi-implicit time-discrete scheme is also presented for the fully coupled thermomechanical system and is proven unconditionally stable and convergent. The corresponding algorithm is then implemented, under a space-homogeneous temperature field assumption, and tested under different conditions. The core of the algorithm is composed of a mechanical subproblem and a thermal subproblem. The iterative scheme is solved by a generalized Newton method. Numerous uniaxial and biaxial tests are reported to assess the performance of the model and algorithm, including variable imposed strain, strain rate, heat exchange properties, and external temperature. In particular, the heat exchange with the environment is the only source of rate-dependency in the model. The reported curves clearly display the interdependence between phase transformation strain and material temperature. The full thermomechanical coupling allows the exothermic and endothermic effects during forward and backward phase transformation, respectively, to be reproduced. The numerical tests have thus demonstrated that the model can appropriately reproduce the coupled SMA behavior under different loading conditions and rates. Moreover, the algorithm has proved effective and robust. Further developments are being considered, such as the extension of the formulation to the finite-strain setting and the study of the boundary value problem.
Keywords: generalized gradient flow, GENERIC formalism, shape memory alloys, thermomechanical coupling
Procedia PDF Downloads 222
591 Hardware in the Loop Platform for Virtual Commissioning: Case Study of a Hydraulic-Press Model Simulated in Real-Time
Authors: Jorge Rodriguez-Guerra, Carlos Calleja, Aron Pujana, Ana Maria Macarulla
Abstract:
Hydraulic-press commissioning consumes a great amount of man-hours, because it takes place several miles away from where the press was designed. This is exacerbated by control designers' lack of knowledge about what the final controller gains will be before they start working with the press. Virtual commissioning has been postulated as an optimal solution to deal with this lack of knowledge. Here, a case study is presented in which a controller is set up against a real-time model of a hydraulic press. The press model is designed following manufacturer specifications and is embedded in a real-time simulator. This methodology ensures that the model achieves responses similar to those of the real machine that will operate in industry. A deterministic communication protocol is in charge of the bidirectional information transmission between the real-time model and the controller. This platform allows the engineer to test and verify the final control responses with exactly the same hardware that is going to be installed in the hydraulic press, in other words, to perform a virtual commissioning of the electro-hydraulic actuator. The Hardware in the Loop (HiL) platform validates the designed control algorithms under laboratory conditions that are harmless to the machine, which allows them to be embedded afterwards in the industrial environment without further modifications.
Keywords: deterministic communication protocol, electro-hydraulic actuator, hardware in the loop, real-time, virtual commissioning
Procedia PDF Downloads 143
590 Stakeholder-Driven Development of a One Health Platform to Prevent Non-Alimentary Zoonoses
Authors: A. F. G. Van Woezik, L. M. A. Braakman-Jansen, O. A. Kulyk, J. E. W. C. Van Gemert-Pijnen
Abstract:
Background: Zoonoses pose a serious threat to public health and economies worldwide, especially as antimicrobial resistance grows and newly emerging zoonoses can cause unpredictable outbreaks. In order to prevent and control emerging and re-emerging zoonoses, collaboration between the veterinary, human health, and public health domains is essential. In reality, however, there is a lack of cooperation between these three disciplines, and uncertainties exist about their tasks and responsibilities. The objective of this ongoing research project (ZonMw funded, 2014-2018) is to develop an online education and communication One Health platform, “eZoon”, for the general public and professionals working in the veterinary, human health, and public health domains to support the risk communication of non-alimentary zoonoses in the Netherlands. The main focus is on education and communication in times of outbreak as well as in daily non-outbreak situations. Methods: A participatory development approach was used in which stakeholders from the veterinary, human health, and public health domains participated. Key stakeholders were identified using business modeling techniques previously used for the design and implementation of antibiotic stewardship interventions, consisting of a literature scan, expert recommendations, and snowball sampling. A stakeholder salience approach was used to rank stakeholders according to their power, legitimacy, and urgency. Semi-structured interviews were conducted with stakeholders (N=20) from all three disciplines to identify current problems in risk communication and stakeholder values for the One Health platform. Interviews were transcribed verbatim and coded inductively by two researchers. Results: The following key values were identified (but were not limited to): (a) need for improved awareness among veterinary and human health professionals of each other's fields; (b) information exchange between veterinary and human health, particularly at the regional level; (c) legal regulations need to match daily practice; (d) professionals and the general public need to be addressed separately, using tailored language and information; (e) information needs to be of value to professionals (relevant, important, accurate, and with financial or other important consequences if ignored) in order to be picked up; and (f) need for accurate information from trustworthy, centrally organised sources to inform the general public. Conclusion: By applying a participatory development approach, we gained insights from multiple perspectives into the main problems of current risk communication strategies in the Netherlands and into stakeholder values. Next, we will continue the iterative development of the One Health platform by presenting the key values to stakeholders for validation and ranking, which will guide further development. We will develop a communication platform with a serious game in which professionals at the regional level will be trained in shared decision making in time-critical outbreak situations, a smart Question & Answer (Q&A) system for the general public tailored towards different user profiles, and social media to inform the general public adequately during outbreaks.
Keywords: ehealth, one health, risk communication, stakeholder, zoonosis
Procedia PDF Downloads 287
589 Production Planning for Animal Food Industry under Demand Uncertainty
Authors: Pirom Thangchitpianpol, Suttipong Jumroonrut
Abstract:
This research investigates the distribution of demand for animal food and the optimum amount of food production at minimum cost. The data consist of customer purchase orders for laying-hen food, the price of laying-hen food, the cost per unit of food inventory, and costs incurred when the food is out of stock, such as fines, overtime, and urgent material purchases. They were collected from January 1990 to December 2013 from a factory in Nakhonratchasima province. The collected data are analyzed in order to explore the distribution of the monthly food demand for laying hens and the inventory rate per unit. The results are used in a stochastic linear programming model for aggregate planning, from which the optimum production at minimum cost can be obtained. Algorithms programmed in MATLAB, using its linprog tools, are used to get the solution. The distribution of the food demand for laying hens and random numbers are used in the model. The study shows that the monthly food demand for laying hens follows a normal distribution, yielding the average monthly production amounts (unit: 30 kg) from January to December. The average minimum total cost over 12 months is Baht 62,329,181.77. Therefore, the production plan can reduce the cost by 14.64% compared to the actual cost.
Keywords: animal food, stochastic linear programming, aggregate planning, production planning, demand uncertainty
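A compact sketch of the aggregate-planning LP follows, using scipy's linprog as a stand-in for the MATLAB linprog tooling mentioned above. Demands are sampled from an assumed fitted normal distribution, stock-out costs are omitted for brevity, and all cost figures are illustrative, not the factory's real data.

```python
# Hedged sketch: a 12-month production plan minimizing production plus
# holding cost subject to inventory balance. All numbers are placeholders.
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(0)
T = 12
demand = rng.normal(loc=10_000, scale=1_500, size=T).round()  # units of 30 kg
c_prod, c_hold = 450.0, 12.0                                  # Baht per unit

# variables: p_1..p_T (production), I_1..I_T (end-of-month inventory >= 0)
c = np.concatenate([np.full(T, c_prod), np.full(T, c_hold)])

# inventory balance: I_t - I_{t-1} - p_t = -d_t, with I_0 = 0
A_eq = np.zeros((T, 2 * T))
for t in range(T):
    A_eq[t, t] = -1.0            # -p_t
    A_eq[t, T + t] = 1.0         # +I_t
    if t > 0:
        A_eq[t, T + t - 1] = -1.0
b_eq = -demand

res = linprog(c, A_eq=A_eq, b_eq=b_eq, bounds=[(0, None)] * (2 * T))
print("monthly production plan:", res.x[:T].round())
print("minimum total cost (Baht):", round(res.fun, 2))
```

In a full stochastic formulation, multiple sampled demand scenarios would share the production variables, with scenario-specific inventory and shortage terms weighted by their probabilities.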
Procedia PDF Downloads 380
588 BeamGA Median: A Hybrid Heuristic Search Approach
Authors: Ghada Badr, Manar Hosny, Nuha Bintayyash, Eman Albilali, Souad Larabi Marie-Sainte
Abstract:
The median problem is widely applied to derive the most reasonable rearrangement phylogenetic tree for many species. More specifically, the problem is concerned with finding a permutation that minimizes the sum of distances between itself and a set of three signed permutations. Genomes with an equal number of genes but different gene orders can be represented as permutations. In this paper, an algorithm, namely BeamGA median, is proposed that combines a heuristic search approach (local beam) as an initialization step to generate a number of solutions, after which a Genetic Algorithm (GA) is applied in order to refine the solutions, aiming to achieve a better median with the smallest possible reversal distance from the three original permutations. In this approach, any genome rearrangement distance can be applied; in this paper, we use the reversal distance. To the best of our knowledge, the proposed approach has not been applied before for solving the median problem. Our approach considers a true biological evolution scenario by applying the concept of common intervals during the GA optimization process. This allows us to imitate true biological behavior and improve the time convergence of the genetic approach. We were able to handle permutations with a large number of genes within acceptable time performance and with the same or better accuracy compared to existing algorithms.
Keywords: median problem, phylogenetic tree, permutation, genetic algorithm, beam search, genome rearrangement distance
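The GA refinement stage might be sketched as below. A breakpoint distance is used as a simple stand-in for the signed reversal distance (which requires the Hannenhalli-Pevzner machinery), and the beam-search initialization and common-interval operators are omitted for brevity.

```python
# Hedged sketch: evolve a candidate median permutation minimizing the
# summed distance to three genomes, with reversal mutations. The
# breakpoint distance is a proxy, not the paper's reversal distance.
import random

random.seed(0)
n = 10
genomes = [random.sample(range(n), n) for _ in range(3)]

def breakpoints(p, q):
    # adjacencies of q broken in p (unsigned: both orientations count)
    adj = {(q[i], q[i + 1]) for i in range(n - 1)}
    adj |= {(b, a) for a, b in adj}
    return sum((p[i], p[i + 1]) not in adj for i in range(n - 1))

def fitness(p):
    return sum(breakpoints(p, g) for g in genomes)   # lower is better

pop = [random.sample(range(n), n) for _ in range(40)]
for _ in range(200):
    pop.sort(key=fitness)
    survivors = pop[:10]
    pop = survivors[:]
    while len(pop) < 40:
        child = random.choice(survivors)[:]
        i, j = sorted(random.sample(range(n), 2))
        child[i:j + 1] = reversed(child[i:j + 1])    # reversal mutation
        pop.append(child)

best = min(pop, key=fitness)
print("median candidate:", best, "total distance:", fitness(best))
```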
Procedia PDF Downloads 266
587 Design of the Intelligent Virtual Learning Coach. A Contextual Learning Approach to Digital Literacy of Senior Learners in the Context of Electronic Health Record (EHR)
Authors: Ilona Buchem, Carolin Gellner
Abstract:
The call to support senior learners in the development of digital literacy has become prevalent in recent years, especially in view of aging societies paired with advances in digitalization in all spheres of life, including e-health. The goal has been to create opportunities for learning that incorporate the use of context in a reflective and dialogical way. Contextual learning has focused on developing skills through the application of authentic problems. While major research efforts in supporting senior learners in developing digital literacy have so far been invested in e-learning, focusing on knowledge acquisition and cognitive tasks, little research exists on reflective mentoring and coaching with the help of pedagogical agents that addresses the contextual dimensions of learning. This paper describes an approach to creating opportunities for senior learners to improve their digital literacy in the authentic context of the electronic health record (EHR) with the support of an intelligent virtual learning coach. The paper focuses on the design of the virtual coach as part of an e-learning system, which was developed in the EPA-Coach project funded by the German Ministry of Education and Research. The paper starts with the theoretical underpinnings of contextual learning and the related design considerations for a virtual learning coach based on previous studies. Since previous research in the area was mostly designed to cater to the needs of younger audiences, the results had to be adapted to the specific needs of senior learners. Next, the paper outlines the stages in the design of the virtual coach, which included the adaptation of the design requirements, the iterative development of the prototypes, the results of the two evaluation studies, and how these results were used to improve the design of the virtual coach. The paper then presents the four prototypes of a senior-friendly virtual learning coach, which were designed to represent different preferences related to visual appearance, communication and social interaction styles, and pedagogical roles. The first evaluation of the virtual coach design was an exploratory, qualitative study, which was carried out in October 2020 with eight seniors aged 64 to 78 and included a range of questions about the preferences of senior learners related to visual design, gender, age, communication, and role. Based on the results of the first evaluation, the design was adapted to the preferences of the senior learners, and new versions of the prototypes were created to represent two male and two female options for the virtual coach. The second evaluation followed a quantitative approach with an online questionnaire and was conducted in May 2021 with 41 seniors aged 66 to 93 years. Following three research questions, the survey asked about (1) the intention to use, (2) the perceived characteristics, and (3) the preferred communication/interaction style of the virtual coach, i.e., task-oriented, relationship-oriented, or a mix. The paper continues with a discussion of the results of the design process and ends with conclusions and next steps in the development of the virtual coach, including recommendations for further research.
Keywords: virtual learning coach, virtual mentor, pedagogical agent, senior learners, digital literacy, electronic health records
Procedia PDF Downloads 180
586 Road Traffic Accidents Analysis in Mexico City through Crowdsourcing Data and Data Mining Techniques
Authors: Gabriela V. Angeles Perez, Jose Castillejos Lopez, Araceli L. Reyes Cabello, Emilio Bravo Grajales, Adriana Perez Espinosa, Jose L. Quiroz Fabian
Abstract:
Road traffic accidents are among the principal causes of traffic congestion, causing human losses, damage to health and the environment, economic losses, and material damage. Traditional studies of road traffic accidents in urban zones require a very high investment of time and money; additionally, the results are often not current. However, in many countries, crowdsourced GPS-based traffic and navigation apps have emerged as an important low-cost source of information for studying road traffic accidents and the urban congestion they cause. In this article, we identify the zones, roads, and specific times in Mexico City (CDMX) in which the largest numbers of road traffic accidents were concentrated during 2016. We built a database compiling information obtained from the social network known as Waze. The methodology employed was Knowledge Discovery in Databases (KDD) for the discovery of patterns in the accident reports. Furthermore, data mining techniques were applied with the help of Weka: the Expectation Maximization (EM) algorithm to obtain the ideal number of clusters for the data, and k-means as the grouping method. Finally, the results were visualized with the Geographic Information System QGIS.
Keywords: data mining, k-means, road traffic accidents, Waze, Weka
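A sketch of the clustering step: EM (Gaussian mixtures) selects the number of clusters via BIC, then k-means groups the reports. The coordinates below are synthetic stand-ins for the Waze-derived accident locations, and scikit-learn replaces Weka.

```python
# Hedged sketch: EM-based model selection for k, then k-means grouping.
# Coordinates are synthetic placeholders around three fake hot spots.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
centers = np.array([[19.43, -99.13], [19.36, -99.18], [19.50, -99.12]])
X = np.vstack([c + rng.normal(0, 0.01, (200, 2)) for c in centers])

bics = {k: GaussianMixture(n_components=k, random_state=0).fit(X).bic(X)
        for k in range(1, 8)}
k_best = min(bics, key=bics.get)               # EM-based model selection

labels = KMeans(n_clusters=k_best, n_init=10, random_state=0).fit_predict(X)
print(f"ideal number of clusters: {k_best}")
for k in range(k_best):
    print(f"cluster {k}: {np.sum(labels == k)} reports, "
          f"centroid {X[labels == k].mean(axis=0)}")
```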
Procedia PDF Downloads 418
585 Charting Sentiments with Naive Bayes and Logistic Regression
Authors: Jummalla Aashrith, N. L. Shiva Sai, K. Bhavya Sri
Abstract:
The swift progress of web technology has not only amassed a vast reservoir of internet data but also triggered a substantial surge in data generation. The internet has metamorphosed into one of the most dynamic hubs for online education, idea dissemination, and opinion sharing. Notably, the widely utilized social networking platform Twitter is experiencing considerable expansion, providing users with the ability to share viewpoints, participate in discussions spanning diverse communities, and broadcast messages on a global scale. The upswing in online engagement has sparked significant interest in subjectivity analysis, particularly when it comes to Twitter data. This research is committed to delving into sentiment analysis, focusing specifically on the realm of Twitter. It aims to offer valuable insights into deciphering information within tweets, where opinions manifest in a highly unstructured and diverse manner, spanning a spectrum from positivity to negativity, occasionally punctuated by expressions of neutrality. Within this document, we offer a comprehensive exploration and comparative assessment of modern approaches to opinion mining. Employing a range of machine learning algorithms, such as Naive Bayes and Logistic Regression, our investigation examines Twitter data streams. We also discuss overarching challenges and applications inherent in the realm of subjectivity analysis over Twitter.
Keywords: machine learning, sentiment analysis, visualisation, python
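The two compared pipelines can be sketched in a few lines of scikit-learn; the tiny inline corpus is a placeholder for a real labeled Twitter dataset.

```python
# Hedged sketch: TF-IDF features feeding Naive Bayes and Logistic
# Regression for tweet polarity. The corpus is an inline placeholder.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

tweets = ["I love this new phone", "worst service ever", "great game tonight",
          "this update is terrible", "absolutely fantastic experience",
          "so disappointed"]
labels = [1, 0, 1, 0, 1, 0]   # 1 = positive, 0 = negative

for name, clf in [("Naive Bayes", MultinomialNB()),
                  ("Logistic Regression", LogisticRegression())]:
    model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), clf)
    model.fit(tweets, labels)
    pred = model.predict(["what a fantastic phone", "terrible, never again"])
    print(name, "->", pred)
```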
Procedia PDF Downloads 56
584 Nonlinear Analysis in Investigating the Complexity of Neurophysiological Data during Reflex Behavior
Authors: Juliana A. Knocikova
Abstract:
Methods of nonlinear signal analysis are based on the finding that random behavior can arise in deterministic nonlinear systems with only a few degrees of freedom. In dynamical systems, entropy is usually understood as the rate of information production. Changes in the temporal dynamics of physiological data indicate the evolution of the system in time and thus the level of new signal pattern generation. During the last decades, many algorithms have been introduced to assess patterns of physiological responses to external stimuli. However, reflex responses usually last for only short periods of time, which represents a great limitation for the usual methods of nonlinear analysis. To solve the problem of short recordings, the approximate entropy parameter has been introduced as a measure of system complexity. A low value of this parameter reflects regularity and predictability in the analyzed time series. Conversely, an increase in this parameter means unpredictability and random behavior, hence higher system complexity. Reduced neurophysiological data complexity has been observed repeatedly when analyzing electroneurogram and electromyogram activities during defence reflex responses. Quantitative phrenic neurogram changes are also evident during severe hypoxia, as well as during airway reflex episodes. In conclusion, the approximate entropy parameter serves as a convenient tool for the analysis of reflex behavior characterized by short-lasting time series.
Keywords: approximate entropy, neurophysiological data, nonlinear dynamics, reflex
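Approximate entropy itself is short enough to state directly. The implementation below follows the standard Pincus definition; m=2 and r=0.2·std are common defaults, not necessarily the settings used in the study above.

```python
# Hedged sketch: approximate entropy (ApEn) for short time series.
# ApEn = Phi(m) - Phi(m+1), where Phi averages the log fraction of
# templates within Chebyshev distance r of each template.
import numpy as np

def approximate_entropy(x, m=2, r=None):
    x = np.asarray(x, dtype=float)
    n = len(x)
    r = 0.2 * x.std() if r is None else r

    def phi(m):
        templates = np.array([x[i:i + m] for i in range(n - m + 1)])
        # Chebyshev distance between every pair of templates
        dist = np.abs(templates[:, None, :] - templates[None, :, :]).max(axis=2)
        c = (dist <= r).mean(axis=1)     # fraction of similar templates
        return np.log(c).mean()

    return phi(m) - phi(m + 1)

rng = np.random.default_rng(0)
regular = np.sin(np.linspace(0, 8 * np.pi, 300))
noisy = rng.normal(size=300)
print("ApEn(sine)  =", round(approximate_entropy(regular), 3))  # low: predictable
print("ApEn(noise) =", round(approximate_entropy(noisy), 3))    # high: complex
```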
Procedia PDF Downloads 300
583 Distances over Incomplete Diabetes and Breast Cancer Data Based on Bhattacharyya Distance
Authors: Loai AbdAllah, Mahmoud Kaiyal
Abstract:
Missing values in real-world datasets are a common problem. Many algorithms have been developed to deal with this problem; most of them replace the missing values with a fixed value computed from the observed values. In our work, we used a distance function based on the Bhattacharyya distance, which measures the similarity of two probability distributions, to measure the distance between objects with missing values. The proposed distance distinguishes between known and unknown values: the distance between two known values is the Mahalanobis distance, while when one of them is missing, the distance is computed from the distribution of the known values for the coordinate that contains the missing value. This method was integrated with Wikaya, a digital health company developing a platform that helps to improve the prevention of chronic diseases such as diabetes and cancer. In order for Wikaya's recommendation system to work, distances between users need to be measured. Since there are missing values in the collected data, a distance function over incomplete user profiles is needed. To evaluate the accuracy of the proposed distance function in reflecting the actual similarity between different objects, some of which contain missing values, we integrated it within the framework of the k-nearest neighbors (kNN) classifier, since its computation is based only on the similarity between objects. To validate this, we ran the algorithm over the diabetes and breast cancer datasets, standard benchmark datasets from the UCI repository. Our experiments show that the kNN classifier using our proposed distance function outperforms kNN using other existing methods.
Keywords: missing values, incomplete data, distance, incomplete diabetes data
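A simplified sketch of the idea follows: a per-coordinate distance that falls back to the distribution of observed values whenever an entry is missing, wrapped in a plain kNN vote. The Gaussian expected-squared-distance form below is an illustration of the fallback principle, not the paper's exact Bhattacharyya-based construction.

```python
# Hedged sketch: distance over incomplete vectors. Known-known pairs use a
# squared difference; pairs involving a missing value use the expected
# squared distance under the per-feature empirical distribution.
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))
y = (X[:, 0] + X[:, 1] > 0).astype(int)
X[rng.random(X.shape) < 0.1] = np.nan          # knock out 10% of entries

mu = np.nanmean(X, axis=0)
var = np.nanvar(X, axis=0)

def dist(a, b):
    d = 0.0
    for j in range(len(a)):
        if not np.isnan(a[j]) and not np.isnan(b[j]):
            d += (a[j] - b[j]) ** 2
        elif np.isnan(a[j]) and np.isnan(b[j]):
            d += 2 * var[j]                     # E[(X - X')^2], independent draws
        else:
            known = a[j] if np.isnan(b[j]) else b[j]
            d += (known - mu[j]) ** 2 + var[j]  # E[(known - X)^2]
    return np.sqrt(d)

def knn_predict(query, k=5):
    dists = np.array([dist(query, row) for row in X])
    nearest = np.argsort(dists)[:k]
    return np.bincount(y[nearest]).argmax()

query = np.array([1.0, np.nan, 0.2, -0.3, np.nan])
print("predicted class:", knn_predict(query))
```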
Procedia PDF Downloads 225
582 Infodemic Detection on Social Media with a Multi-Dimensional Deep Learning Framework
Authors: Raymond Xu, Cindy Jingru Wang
Abstract:
Social media has become a globally connected and influential platform. Social media data, such as tweets, can help predict the spread of pandemics and provide individuals and healthcare providers with early warnings. Public psychological reactions and opinions on the progression of dominant topics on Twitter can be efficiently monitored by AI models. However, statistics show that as the coronavirus spreads, so does an infodemic of misinformation due to pandemic-related factors such as unemployment and lockdowns. Social media algorithms are often biased toward outrage by promoting content that people have an emotional reaction to and are likely to engage with. This can influence users' attitudes and cause confusion. Therefore, social media is a double-edged sword, and combating fake news and biased content has become one of the essential tasks. This research analyzes the variety of methods used for fake news detection, covering random forest, logistic regression, support vector machines, decision trees, naive Bayes, BoW, TF-IDF, LDA, CNN, RNN, LSTM, DeepFake, and hierarchical attention networks. The performance of each method is analyzed. Based on these models' achievements and limitations, a multi-dimensional AI framework is proposed to achieve higher accuracy in infodemic detection, especially for pandemic-related news. The model is trained on contextual content, images, and news metadata.
Keywords: artificial intelligence, fake news detection, infodemic detection, image recognition, sentiment analysis
Procedia PDF Downloads 259
581 Workforce Optimization: Fair Workload Balance and Near-Optimal Task Execution Order
Authors: Alvaro Javier Ortega
Abstract:
A large number of companies face the challenge of matching highly skilled professionals to high-end positions through human resource deployment professionals. However, when the lists of professionals and tasks to be matched grow beyond a few dozen, the result of this process is far from optimal and takes a long time to produce. Therefore, an automated assignment algorithm for this workforce management problem is needed. The majority of companies are divided into several sectors or departments, where trained employees with different experience levels deal with a large number of tasks daily. Also, the execution order of all tasks is of material consequence, since some of these tasks can only be run once the results of other tasks are available. Thus, a wrong execution order leads to large waiting times between consecutive tasks. The desired goal is, therefore, to create accurate matches and a near-optimal execution order that maximizes the number of tasks performed and minimizes the idle time of the expensive skilled employees. The problem described above can be modeled as a mixed-integer nonlinear program (MINLP), as will be shown in detail in this paper. A large number of MINLP algorithms have been proposed in the literature. Here, genetic algorithm solutions are considered, and a comparison between two different mutation approaches is presented. The simulated results, considering different complexity levels of assignment decisions, show the appropriateness of the proposed model.
Keywords: employees, genetic algorithm, industry management, workforce
Procedia PDF Downloads 168
580 Vision-Based Daily Routine Recognition for Healthcare with Transfer Learning
Authors: Bruce X. B. Yu, Yan Liu, Keith C. C. Chan
Abstract:
We propose to record the Activities of Daily Living (ADLs) of elderly people using a vision-based system so as to provide better assistive and personalization technologies. Current ADL-related research is based on data collected with the help of non-elderly subjects in laboratory environments, where the activities performed are predetermined for the sole purpose of data collection. To obtain more realistic datasets for the application, we recorded ADLs for the elderly with data collected from a real-world environment involving real elderly subjects. Motivated by the need to collect data for more effective research related to elderly care, we chose to collect data in the room of an elderly person. Specifically, we installed Kinect, a vision-based sensor, on the ceiling to capture the activities that the elderly subject performs every morning. Based on the data, we identified 12 morning activities that the elderly person performs daily. To recognize these activities, we created the HARELCARE framework to investigate the effectiveness of existing Human Activity Recognition (HAR) algorithms and propose the use of a transfer learning algorithm for HAR. We compared the performance in terms of accuracy and training progress. Although the collected dataset is relatively small, the proposed algorithm has good potential to be applied to all daily routine activities for healthcare purposes, such as evidence-based diagnosis and treatment.
Keywords: daily activity recognition, healthcare, IoT sensors, transfer learning
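The transfer-learning idea can be sketched with a pretrained backbone and a new 12-class head. The frozen ResNet-18, optimizer settings, and the random tensors standing in for Kinect frames are illustrative assumptions, not the HARELCARE implementation.

```python
# Hedged sketch: reuse a network pretrained on a large image corpus and
# retrain only its final layer for 12 morning ADL classes (PyTorch and
# torchvision assumed).
import torch
import torch.nn as nn
from torchvision import models

num_adl_classes = 12
model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
for param in model.parameters():
    param.requires_grad = False                  # freeze pretrained features
model.fc = nn.Linear(model.fc.in_features, num_adl_classes)  # trainable head

optimizer = torch.optim.Adam(model.fc.parameters(), lr=1e-3)
criterion = nn.CrossEntropyLoss()

# one illustrative training step on a random batch standing in for frames
frames = torch.randn(8, 3, 224, 224)
labels = torch.randint(0, num_adl_classes, (8,))
optimizer.zero_grad()
loss = criterion(model(frames), labels)
loss.backward()
optimizer.step()
print("step loss:", loss.item())
```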
Procedia PDF Downloads 132
579 Multi-Agent System Based Distributed Voltage Control in Distribution Systems
Authors: A. Arshad, M. Lehtonen, M. Humayun
Abstract:
With increasing Distributed Generation (DG) penetration, distribution systems are advancing towards smart grid technology for minimal latency in tackling the voltage control problem in a distributed manner. This paper proposes a multi-agent based distributed voltage control. In this method, a flat agent architecture is used; the agents involved in the whole control procedure are the On-Load Tap Changer Agent (OLTCA), the Static VAR Compensator Agent (SVCA), and the agents associated with DGs and loads at their locations. The objectives of the proposed voltage control model are to minimize network losses and DG curtailments while maintaining the voltage within statutory limits, as close as possible to nominal. The total loss cost is the sum of the network losses cost, DG curtailment costs, and voltage damage cost (which is based on a penalty function implementation). The total cost is iteratively calculated for various stricter limits by plotting voltage damage cost and losses cost against a varying voltage limit band. The method provides optimal limits, closer to the nominal value, with minimum total loss cost. In order to achieve the objective of voltage control, the whole network is divided into multiple control regions, each downstream from a controlling device. At each time step, a token is generated by the OLTCA and transfers from node to node until a node with a voltage violation is detected. Upon detection of such a node, the token grants permission to the Load Agent (LA) to initiate possible remedial actions. The LA contacts the respective controlling devices depending on the vicinity of the violated node. If the violated node does not lie in the vicinity of a controller, or the controlling capabilities of all the downstream control devices are at their limits, the OLTC is considered as a last resort. For a realistic study, simulations are performed for a typical Finnish residential medium-voltage distribution system using MATLAB®. These simulations are executed for two cases: simple Distributed Voltage Control (DVC) and DVC with optimized loss cost (DVC + Penalty Function). A sensitivity analysis is performed based on DG penetration. The results indicate that the costs of losses and DG curtailments are directly proportional to the DG penetration, while in case 2 there is a significant reduction in total loss. For lower DG penetration, losses are reduced by roughly 50%, while for higher DG penetration, the loss reduction is not very significant. Another observation is that the stricter limits calculated by cost optimization move towards the statutory limits of ±10% of nominal with increasing DG penetration: for 25, 45, and 65% penetration, the calculated limits are ±5%, ±6.25%, and ±8.75%, respectively. The results show that the novel voltage control algorithm proposed in case 1 is able to deal with the voltage control problem instantly, but with higher losses. In contrast, case 2 reduces the network losses through the proposed iterative loss cost optimization by the OLTCA, slowly over time.
Keywords: distributed voltage control, distribution system, multi-agent systems, smart grids
Procedia PDF Downloads 312
578 ANOVA-Based Feature Selection and Machine Learning System for IoT Anomaly Detection
Authors: Muhammad Ali
Abstract:
Cyber-attacks and anomalies on Internet of Things (IoT) infrastructure are an emerging concern in the domain of data-driven intrusion detection. Rapidly increasing IoT risk is now making headlines around the world. Denial of service, malicious control, data type probing, malicious operation, DDoS, scanning, spying, and wrong setup are attacks and anomalies that can cause an IoT system failure. Everyone talks about cyber security, connectivity, smart devices, and real-time data extraction. IoT devices expose a wide variety of new cyber security attack vectors in network traffic. Beyond IoT development, and mainly for smart and IoT applications, there is a need for intelligent processing and analysis of data; our approach therefore focuses on securing such systems. We train and compare several machine learning models for accurately predicting attacks and anomalies on IoT systems, considering IoT applications, using ANOVA-based feature selection so that leaner prediction models can evaluate network traffic and help protect IoT devices. The machine learning (ML) algorithms used here are KNN, SVM, NB, DT, and RF, evaluated for the most satisfactory test accuracy with fast detection. The evaluated ML metrics include precision, recall, F1 score, FPR, NPV, geometric mean (GM), MCC, and AUC/ROC. The Random Forest algorithm achieved the best results, with less prediction time and an accuracy of 99.98%.
Keywords: machine learning, analysis of variance, Internet of Things, network security, intrusion detection
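A minimal sketch of the described pipeline follows, with ANOVA F-test selection feeding a Random Forest; the synthetic, imbalanced data stand in for real IoT traffic features.

```python
# Hedged sketch: ANOVA-based feature selection (F-test) followed by a
# Random Forest anomaly detector. Data are a synthetic placeholder.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.metrics import classification_report
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline

X, y = make_classification(n_samples=5000, n_features=40, n_informative=10,
                           weights=[0.9, 0.1], random_state=0)  # attacks are rare
X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)

model = make_pipeline(
    SelectKBest(f_classif, k=15),              # ANOVA-based feature selection
    RandomForestClassifier(n_estimators=200, random_state=0),
)
model.fit(X_tr, y_tr)
print(classification_report(y_te, model.predict(X_te), digits=3))
```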
Procedia PDF Downloads 126
577 Techno-Economic Assessments of Promising Chemicals from a Sugar Mill Based Biorefinery
Authors: Kathleen Frances Haigh, Mieke Nieder-Heitmann, Somayeh Farzad, Mohsen Ali Mandegari, Johann Ferdinand Gorgens
Abstract:
Lignocellulose can be converted to a range of biochemicals and biofuels. Where this is derived from agricultural waste, issues of competition with food are virtually eliminated. One such source of lignocellulose is the South African sugar industry. Lignocellulose could be accessed by changes to the current farming practices and investments in more efficient boilers. The South African sugar industry is struggling due to falling sugar prices and increasing costs, and it is proposed that annexing a biorefinery to a sugar mill will broaden the product range and improve viability. Process simulations of the selected chemicals were generated using Aspen Plus®. It was envisaged that a biorefinery would be annexed to a typical South African sugar mill. Bagasse would be diverted from the existing boilers to the biorefinery and mixed with harvest residues. This biomass would provide the feedstock for the biorefinery and the process energy for the biorefinery and sugar mill. Thus, in all scenarios a portion of the biomass was diverted to a new efficient combined heat and power plant (CHP). The Aspen Plus® simulations provided the mass and energy balance data to carry out an economic assessment of each scenario. The net present value (NPV), internal rate of return (IRR), and minimum selling price (MSP) were calculated for each scenario. As a starting point, scenarios were generated to investigate the production of ethanol, ethanol and lactic acid, ethanol and furfural, butanol, methanol, and Fischer-Tropsch syncrude. The bypass to the CHP plant is a useful indicator of the energy demands of the chemical processes. An iterative approach was used to identify a suitable bypass because increasing this value had the combined effect of increasing the amount of energy available and reducing the capacity of the chemical plant. Bypass values ranged from 30% for syncrude production to 50% for combined ethanol and furfural production. A hurdle rate of 15.7% was selected for the IRR. The butanol, combined ethanol and furfural, and Fischer-Tropsch syncrude scenarios are unsuitable for investment, with IRRs of 4.8%, 7.5%, and 11.5%, respectively. This provides valuable insights into research opportunities. For example, furfural from sugarcane bagasse is an established process, although the integration of furfural production with ethanol is less well understood. The IRR for the ethanol scenario was 14.7%, which is below the investment criterion, but given the technological maturity it may still be considered for investment. The scenarios that met the investment criterion were the combined ethanol and lactic acid scenario and the methanol scenario, with IRRs of 20.5% and 16.7%, respectively. These assessments show that the production of biochemicals from lignocellulose can be commercially viable. In addition, these assessments have provided valuable insights for research to improve the commercial viability of additional chemicals and scenarios. This has led to further assessments of the production of itaconic acid, succinic acid, citric acid, xylitol, polyhydroxybutyrate, polyethylene, glucaric acid, and glutamic acid.
Keywords: biorefineries, sugar mill, methanol, ethanol
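The screening arithmetic can be illustrated directly: NPV at the 15.7% hurdle rate, and IRR recovered as the positive real root of the cash-flow polynomial. The cash flows below are placeholders, not figures from the study.

```python
# Hedged sketch: NPV at a hurdle rate and IRR from the cash-flow
# polynomial. With x = 1/(1+r), NPV = sum(cf_t * x^t) is a polynomial in x.
import numpy as np

hurdle = 0.157
# year-0 investment followed by 9 years of net revenue (placeholder values)
cash_flows = np.array([-250.0] + [48.0] * 9)   # million currency units

years = np.arange(len(cash_flows))
npv = np.sum(cash_flows / (1 + hurdle) ** years)

roots = np.roots(cash_flows[::-1])             # highest-degree coefficient first
real = roots[np.isreal(roots)].real
rates = 1 / real[real > 0] - 1
irr = rates[np.argmin(np.abs(rates))]          # the economically meaningful root

print(f"NPV at hurdle rate: {npv:.1f}")
print(f"IRR: {irr:.1%} -> {'invest' if irr > hurdle else 'reject'}")
```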
Procedia PDF Downloads 197
576 Development and Preliminary Testing of the Dutch Version of the Program for the Education and Enrichment of Relational Skills
Authors: Sakinah Idris, Gabrine Jagersma, Bjorn Jaime Van Pelt, Kirstin Greaves-Lord
Abstract:
Background: The PEERS (Program for the Education and Enrichment of Relational Skills) intervention can be considered a well-established, evidence-based intervention in the USA. However, testing the efficacy of cultural adaptations of PEERS is still ongoing. More and more, the involvement of all stakeholders in the development and evaluation of interventions is acknowledged as crucial for the longer-term implementation of interventions across settings. Therefore, in the current project, teens with ASD (Autism Spectrum Disorder), their neurotypical peers, parents, teachers, as well as clinicians were involved in the development and evaluation of the Dutch version of PEERS. Objectives: The current presentation covers (1) the formative phase and (2) the preliminary adaptation test phase of the cultural adaptation of evidence-based interventions. In the formative phase, we aim to describe the process of adaptation of the PEERS program to the Dutch culture and care system. In the preliminary adaptation phase, we will present results from the preliminary adaptation test among 32 adolescents with ASD. Methods: In phase 1, a group discussion on common vocabulary was conducted among 70 teenagers (and their teachers) from special and regular education, aged 12-18. This inventory concerned 14 key constructs from PEERS, e.g., areas of interest, locations for making friends, common peer groups and crowds inside and outside of school, activities with friends, commonly used ways for electronic communication, ways for handling disagreements, and common teasing comebacks. Also, 15 clinicians were involved in the translation and cultural adaptation process. The translation and cultural adaptation process was guided by the research team, which incorporated input and feedback from all stakeholders through an iterative feedback procedure. In phase 2, the parent-reported Social Responsiveness Scale (SRS), the Test of Adolescent Social Skills Knowledge (TASSK), and the Quality of Socialization Questionnaire (QSQ) were assessed pre- and post-intervention to evaluate potential treatment outcomes. Results: The most striking cultural adaptation, reflecting the standpoints of all stakeholders, concerned the strategies for handling rumors and gossip, which were suggested to be taught using a similar approach to the teasing comebacks, more in line with 'down-to-earth' Dutch standards. The preliminary testing of this adapted version indicated that the adolescents with ASD significantly improved their social knowledge (TASSK; t₃₁ = -10.9, p < .01), social experience (QSQ-Parent; t₃₁ = -4.2, p < .01 and QSQ-Adolescent; t₃₂ = -3.8, p < .01), and parent-reported social responsiveness (SRS; t₃₃ = 3.9, p < .01). In addition, subjective evaluations of teens with ASD, their parents, and clinicians were positive. Conclusions: In order to further scrutinize the effectiveness of the Dutch version of the PEERS intervention, we recommend performing a larger-scale randomized controlled trial (RCT), for which we provide several methodological considerations.
Keywords: cultural adaptation, PEERS, preliminary testing, translation
Procedia PDF Downloads 168
575 Using Interval Type-2 Fuzzy Controller for Diabetes Mellitus
Authors: Nafiseh Mollaei, Reihaneh Kardehi Moghaddam
Abstract:
In the case of diabetes mellitus, controlling insulin is very difficult. This incurable disease affects millions of people worldwide. Glucose is a sugar which provides energy to the cells, and insulin is a hormone which supports the absorption of glucose. A fuzzy control strategy is attractive for glucose control because it mimics the first- and second-phase responses that the pancreatic beta cells use to control glucose. We propose two control algorithms for insulin infusion: a type-1 fuzzy controller and an interval type-2 fuzzy method. The closed-loop system has been simulated for different patients with different parameters, in the presence of food intake disturbances, and it has been shown that the blood glucose concentration settles at a normoglycemic level of 110 mg/dl in a reasonable amount of time. This paper deals with type 1 diabetes as a nonlinear model, which has been simulated in the MATLAB-SIMULINK environment. The novel model, termed the Augmented Minimal Model, is used in the simulations. There are some uncertainties in this model due to factors such as blood glucose, daily meals, or sudden stress. To eliminate the effects of this uncertainty, different control methods may be utilized. In this article, fuzzy controller performance was assessed in terms of the ability to track a normoglycemic set point (110 mg/dl) in response to a [0-10] g meal disturbance. Finally, the development reported in this paper is intended to simplify insulin delivery, thereby increasing the quality of life of the patient.
Keywords: interval type-2, fuzzy controller, minimal augmented model, uncertainty
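To make the control strategy concrete, the sketch below implements a minimal type-1 fuzzy controller of the kind described, mapping the deviation of blood glucose from the 110 mg/dl set point to an insulin infusion rate; an interval type-2 controller would replace each membership degree with a [lower, upper] interval and apply type reduction (e.g., Karnik-Mendel) before defuzzification. All membership bounds and rule consequents are hypothetical placeholders, not the paper's tuning.

```python
# Minimal sketch of a fuzzy glucose controller (type-1 for brevity).
# Membership bounds and output rates are hypothetical placeholders.

def tri(x, a, b, c):
    """Triangular membership function peaking at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def insulin_rate(glucose):
    """Rule base: the higher the glucose above 110 mg/dl, the more insulin."""
    error = glucose - 110.0  # deviation from the normoglycemic set point
    # Fuzzify the error into three linguistic terms.
    weights = {
        "low":    tri(error, -80, -40, 0),
        "normal": tri(error, -20,   0, 20),
        "high":   tri(error,   0,  60, 120),
    }
    # Each rule maps a term to a crisp insulin rate (Sugeno-style consequent).
    rates = {"low": 0.0, "normal": 0.5, "high": 2.0}  # U/h, placeholders
    total = sum(weights.values()) or 1.0
    return sum(weights[k] * rates[k] for k in rates) / total

for g in (90, 110, 160, 220):
    print(f"glucose {g} mg/dl -> insulin {insulin_rate(g):.2f} U/h")
```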
Procedia PDF Downloads 431
574 Optimization of a Hand-Fan Shaped Microstrip Patch Antenna by Means of Orthogonal Design Method of Design of Experiments for L-Band and S-Band Applications
Authors: Jaswinder Kaur, Nitika, Navneet Kaur, Rajesh Khanna
Abstract:
A hand-fan shaped microstrip patch antenna (MPA) for L-band and S-band applications is designed, and its characteristics have been investigated. The proposed microstrip patch antenna with a double U-slot defected ground structure (DGS) is fabricated on an FR4 substrate, a readily available and inexpensive material. The suggested antenna is optimized using the Orthogonal Design Method (ODM) of Design of Experiments (DOE) to cover the frequency range from 0.91-2.82 GHz for L-band and S-band applications. The L-band covers the frequency range of 1-2 GHz, which is allocated to telemetry, aeronautical, and military systems for passive satellite sensors, weather radars, radio astronomy, and mobile communication. The S-band covers the frequency range of 2-3 GHz, which is used by weather radars, surface ship radars, and communication satellites, and is also reserved for various wireless applications such as Worldwide Interoperability for Microwave Access (Wi-MAX), super high frequency radio frequency identification (SHF RFID), industrial, scientific and medical (ISM) bands, Bluetooth, wireless broadband (Wi-Bro) and wireless local area network (WLAN). The proposed optimization method is very time-efficient and accurate compared to conventional evolutionary algorithms due to its statistical strategy. Moreover, the antenna is tested, and the simulated and measured results are compared.
Keywords: design of experiments, hand fan shaped MPA, L-Band, orthogonal design method, S-Band
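The appeal of the Orthogonal Design Method is that an orthogonal array covers the factor space with far fewer runs than a full factorial sweep. The sketch below uses the standard L9(3^4) array to screen four antenna parameters at three levels each in nine runs instead of 3^4 = 81; the parameter names and the stand-in objective function are hypothetical assumptions, since the real responses would come from full-wave simulations.

```python
# Sketch of the Orthogonal Design Method with a standard L9(3^4) array.
# Factor names and the stand-in objective are hypothetical placeholders.
import numpy as np

L9 = np.array([  # standard L9 orthogonal array, levels coded 0..2
    [0, 0, 0, 0], [0, 1, 1, 1], [0, 2, 2, 2],
    [1, 0, 1, 2], [1, 1, 2, 0], [1, 2, 0, 1],
    [2, 0, 2, 1], [2, 1, 0, 2], [2, 2, 1, 0],
])

def bandwidth_ghz(run):
    """Stand-in for a full-wave simulation of one parameter combination."""
    slot_len, slot_wid, dgs_len, feed_pos = run
    return 1.2 + 0.3 * slot_len + 0.1 * slot_wid - 0.05 * feed_pos

results = np.array([bandwidth_ghz(run) for run in L9])

# Main effect of each factor: mean response at each of its three levels.
for f in range(4):
    means = [results[L9[:, f] == lvl].mean() for lvl in range(3)]
    best = int(np.argmax(means))
    print(f"factor {f}: level means = {[float(round(m, 2)) for m in means]}, "
          f"best level = {best}")
```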
Procedia PDF Downloads 134
573 A Palmprint Identification System Based Multi-Layer Perceptron
Authors: David P. Tantua, Abdulkader Helwan
Abstract:
Biometrics has recently been used in human identification systems based on biological traits such as fingerprints and iris scans. Biometrics-based identification systems show great efficiency and accuracy in such human identification applications. However, these systems have so far relied on image processing techniques only, which may decrease the efficiency of such applications. This paper therefore aims to develop a human palmprint identification system using a multi-layer perceptron neural network, which has the capability to learn using a backpropagation learning algorithm. The developed system uses images obtained from a public database available on the internet (CASIA). The processing pipeline is as follows: image filtering using a median filter, image adjustment, image skeletonizing, edge detection using the Canny operator to extract features, and removal of unwanted components of the image. The second phase is to feed those processed images into a neural network classifier, which adaptively learns and creates a class for each different image. 100 different images are used for training the system. Since this is an identification system, it is tested with the same images: the same 100 images are used for testing, and any image outside the training set should be unrecognized. The experimental results show that the developed system achieves 100% accuracy and can be implemented in real-life applications.
Keywords: biometrics, biological traits, multi-layer perceptron neural network, image skeletonizing, edge detection using canny operator
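The processing chain described above maps naturally onto scikit-image and scikit-learn primitives. The sketch below is a minimal, hypothetical rendering of that pipeline, with random placeholder images standing in for the CASIA database; it is not the authors' implementation, and the network size is an assumption.

```python
# Sketch of the described pipeline: median filter -> contrast adjustment ->
# skeletonization -> Canny edges -> MLP classifier. Placeholder data only.
import numpy as np
from skimage import filters, exposure, feature, morphology
from sklearn.neural_network import MLPClassifier

def extract_features(gray_image):
    """Turn one grayscale palm image into a flat feature vector."""
    smooth = filters.median(gray_image)                # remove speckle noise
    adjusted = exposure.equalize_hist(smooth)          # contrast adjustment
    skeleton = morphology.skeletonize(adjusted > 0.5)  # thin the ridge lines
    edges = feature.canny(adjusted)                    # Canny edge map
    return np.concatenate([skeleton.ravel(), edges.ravel()]).astype(float)

# Hypothetical data: 100 palm images, one class per person.
images = [(np.random.rand(64, 64) * 255).astype(np.uint8) for _ in range(100)]
labels = list(range(100))

X = np.array([extract_features(img) for img in images])
clf = MLPClassifier(hidden_layer_sizes=(128,), max_iter=500)  # backprop MLP
clf.fit(X, labels)
print("training-set accuracy:", clf.score(X, labels))  # tested on same images
```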
Procedia PDF Downloads 373
572 Opacity Synthesis with Orwellian Observers
Authors: Moez Yeddes
Abstract:
The property of opacity is widely used in the formal verification of security in computer systems and protocols. Opacity is a general language-theoretic scheme that captures many security properties of systems: it is parametrized by a framework in which several security properties of a system can be expressed. A secret behaviour of a system is opaque if a passive attacker can never deduce its occurrence from the system observation. Instead of considering the case of static observability, where the set of observable events is fixed off-line, or dynamic observability, where the set of observable events changes over time depending on the history of the trace, we introduce Orwellian partial observability, where unobservable events are not revealed provided that downgrading events never occur in the future of the trace. Orwellian partial observability is needed to model intransitive information flow; this kind of observation is known as the ipurge function. In previous work, we showed that verifying whether a regular secret is opaque for a regular language L w.r.t. an Orwellian projection is PSPACE-complete, while the problem has been proved undecidable even for a regular language L w.r.t. a general Orwellian observation function. In this paper, we address two problems of opacification of a regular secret ϕ for a regular language L w.r.t. an Orwellian projection: given L and a secret ϕ ∈ L, the first problem consists in computing some minimal regular super-language M of L, if it exists, such that ϕ is opaque for M, and the second consists in computing the supremal sub-language M′ of L such that ϕ is opaque for M′. We derive both language-theoretic characterizations and algorithms to solve these two dual problems.
Keywords: security policies, opacity, formal verification, Orwellian observation
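The sketch below illustrates the opacity condition on a toy finite language: a secret is opaque iff every secret trace shares its observation with some non-secret trace. The alphabet, the downgrading event, and the traces are illustrative assumptions, and the revelation rule is a simplified rendering of the ipurge-style Orwellian projection described above, not the paper's construction.

```python
# Toy opacity check on finite trace sets. Under this simplified Orwellian
# projection, an unobservable event is revealed only when a downgrading
# event occurs later in the trace. All traces here are illustrative.

OBSERVABLE = {"a", "b"}
DOWNGRADE = {"d"}   # downgrading events are themselves observable

def orwellian_project(trace):
    """Observation of a trace under the simplified Orwellian projection."""
    out = []
    for i, e in enumerate(trace):
        future_downgrade = any(x in DOWNGRADE for x in trace[i + 1:])
        if e in OBSERVABLE or e in DOWNGRADE or future_downgrade:
            out.append(e)
    return tuple(out)

def is_opaque(language, secret):
    """Secret is opaque iff every secret trace shares its observation
    with at least one non-secret trace of the language."""
    non_secret_obs = {orwellian_project(t) for t in language - secret}
    return all(orwellian_project(t) in non_secret_obs for t in secret)

L = {("a", "s", "b"), ("a", "b"), ("a", "s", "d", "b")}
phi = {("a", "s", "b")}      # the secret behaviour
print(is_opaque(L, phi))     # True: ("a", "b") yields the same observation
```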
Procedia PDF Downloads 226