Search results for: artificial intelligence and genetic algorithms
3677 Postmortem Genetic Testing of Sudden and Unexpected Deaths Using Next Generation Sequencing
Authors: Eriko Ochiai, Fumiko Satoh, Keiko Miyashita, Yu Kakimoto, Motoki Osawa
Abstract:
Sudden and unexpected deaths from unknown causes occur in infants and youths. Recently, molecular links between some of these deaths and several genetic diseases have been examined postmortem. For instance, hereditary long QT syndrome and Brugada syndrome are occasionally fatal through critical ventricular tachyarrhythmia. Because a large number of target genes are responsible for such diseases, conventional analysis using the Sanger method has been laborious. In this report, we attempted to analyze sudden deaths comprehensively using the next generation sequencing (NGS) technique. Multiplex PCR of each subject's DNA was performed using Ion AmpliSeq Library Kits 2.0 and the Ion AmpliSeq Inherited Disease Panel (Life Technologies). After the library was constructed by emulsion PCR, the amplicons were sequenced for 500 flows on the Ion Personal Genome Machine System (Life Technologies) according to the manufacturer's instructions. SNPs and indels were called from the sequence reads mapped to the hg19 reference sequence. This project has been approved by the ethical committee of Tokai University School of Medicine. As a representative case, the molecular analysis of a 40-year-old male who had received a diagnosis of Brugada syndrome demonstrated a total of 584 SNPs or indels. Non-synonymous and frameshift nucleotide substitutions were selected in the coding regions of the heart disease related genes ANK2, AKAP9, CACNA1C, DSC2, KCNQ1, MYLK, SCN1B, and STARD3. In particular, a c.629T>C transition in exon 3 of the SCN1B gene, resulting in a leu210-to-pro (L210P) substitution, is predicted to be “damaging” by the SIFT program. Because the mutation has not been reported previously, it is unclear whether the substitution is pathogenic. Sudden deaths for which the cause cannot be determined constitute one of the most important unsolved subjects in forensic pathology. The Ion AmpliSeq Inherited Disease Panel can amplify the exons of 328 genes at one time.
We recognized the difficulty of selecting the true causal variant from a number of candidates, but postmortem genetic testing using NGS analysis is a worthwhile diagnostic tool. We are now extending this analysis to suspected SIDS cases and young sudden death victims. Keywords: postmortem genetic testing, sudden death, SIDS, next generation sequencing
Procedia PDF Downloads 3593676 Signal Restoration Using Neural Network Based Equalizer for Nonlinear channels
Authors: Z. Zerdoumi, D. Benatia, D. Chicouche
Abstract:
This paper investigates the application of artificial neural networks to the problem of nonlinear channel equalization. The difficulties caused by channel distortions such as inter-symbol interference (ISI) and nonlinearity can be overcome by nonlinear equalizers employing neural networks. It has been shown that multilayer perceptron based equalizers significantly outperform linear equalizers. We present a multilayer perceptron based equalizer with decision feedback (MLP-DFE) trained with the back propagation algorithm. The capacity of the MLP-DFE to deal with nonlinear channels is evaluated. Simulation results show that the MLP based DFE significantly improves the restored signal quality, the steady-state mean square error (MSE), and the minimum bit error rate (BER) compared with its conventional counterpart. Keywords: artificial neural network, signal restoration, nonlinear channel equalization, equalization
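The MLP-DFE idea above can be sketched in a few dozen lines. The channel model, BPSK symbols, tap counts, and layer sizes below are illustrative assumptions, not the paper's configuration; the point is only to show a one-hidden-layer perceptron with a decision-feedback input trained by back propagation:

```python
import math
import random

random.seed(0)

def channel(symbols, noise_std=0.05):
    """Illustrative nonlinear channel: ISI followed by a memoryless nonlinearity."""
    out, prev = [], 0.0
    for s in symbols:
        lin = s + 0.5 * prev                      # inter-symbol interference
        prev = s
        nl = lin + 0.2 * lin**2 - 0.1 * lin**3    # channel nonlinearity
        out.append(nl + random.gauss(0.0, noise_std))
    return out

class MLPDFE:
    """One-hidden-layer MLP equalizer with one decision-feedback tap."""
    def __init__(self, n_in=4, n_hidden=8, lr=0.05):
        self.lr = lr
        self.w1 = [[random.uniform(-0.5, 0.5) for _ in range(n_in)]
                   for _ in range(n_hidden)]
        self.b1 = [0.0] * n_hidden
        self.w2 = [random.uniform(-0.5, 0.5) for _ in range(n_hidden)]
        self.b2 = 0.0

    def forward(self, x):
        self.x = x
        self.h = [math.tanh(sum(w * xi for w, xi in zip(row, x)) + b)
                  for row, b in zip(self.w1, self.b1)]
        self.y = math.tanh(sum(w * h for w, h in zip(self.w2, self.h)) + self.b2)
        return self.y

    def backward(self, target):
        """One step of back propagation on the squared output error."""
        dy = (self.y - target) * (1.0 - self.y**2)
        dh = [dy * w * (1.0 - h**2) for w, h in zip(self.w2, self.h)]
        for j, h in enumerate(self.h):
            self.w2[j] -= self.lr * dy * h
        self.b2 -= self.lr * dy
        for j, g in enumerate(dh):
            for i, xi in enumerate(self.x):
                self.w1[j][i] -= self.lr * g * xi
            self.b1[j] -= self.lr * g

# Train on known BPSK symbols (feedback tap fed with the true past symbol),
# then evaluate BER in decision-directed mode.
train = [random.choice((-1, 1)) for _ in range(4000)]
rx = channel(train)
eq = MLPDFE()
for _ in range(3):
    for n in range(2, len(train)):
        eq.forward(rx[n-2:n+1] + [float(train[n-1])])  # 3 channel taps + feedback
        eq.backward(float(train[n]))

test = [random.choice((-1, 1)) for _ in range(2000)]
rx_t = channel(test)
errors, fb = 0, 1.0
for n in range(2, len(test)):
    dec = 1 if eq.forward(rx_t[n-2:n+1] + [fb]) >= 0 else -1
    fb = float(dec)                                    # decision feedback
    errors += dec != test[n]
ber = errors / (len(test) - 2)
print("test BER:", ber)
```

On this toy channel the equalizer learns the nonlinear decision boundary within a few passes; a real study would compare the resulting MSE and BER curves against a linear DFE.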
Procedia PDF Downloads 4973675 Genomic Sequence Representation Learning: An Analysis of K-Mer Vector Embedding Dimensionality
Authors: James Jr. Mashiyane, Risuna Nkolele, Stephanie J. Müller, Gciniwe S. Dlamini, Rebone L. Meraba, Darlington S. Mapiye
Abstract:
When performing language tasks in natural language processing (NLP), the dimensionality of word embeddings is chosen either ad hoc or by optimizing the Pairwise Inner Product (PIP) loss. The PIP loss is a metric that measures the dissimilarity between word embeddings, and it is obtained through matrix perturbation theory by utilizing the unitary invariance of word embeddings. In genomics, especially in genome sequence processing, unlike in natural language processing, there is no notion of a “word”; rather, there are sequence substrings of length k called k-mers. K-mer sizes matter, and they vary depending on the goal of the task at hand. The dimensionality of word embeddings in NLP has been studied using matrix perturbation theory and the PIP loss. In this paper, the sufficiency and reliability of applying word-embedding algorithms to various genomic sequence datasets are investigated to understand the relationship between the k-mer size and the embedding dimension. This is accomplished by studying the scaling capability of three embedding algorithms, namely Latent Semantic Analysis (LSA), Word2Vec, and Global Vectors (GloVe), with respect to the k-mer size. Utilising the PIP loss as a metric to train embeddings on different datasets, we also show that Word2Vec outperforms LSA and GloVe in accurately computing embeddings as both the k-mer size and vocabulary increase. Finally, the shortcomings of natural language processing embedding algorithms in performing genomic tasks are discussed. Keywords: word embeddings, k-mer embedding, dimensionality reduction
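The PIP loss the abstract relies on is easy to state concretely. The following sketch (with illustrative toy vectors, not embeddings from the paper) extracts k-mers, computes the PIP loss between two embedding matrices, and checks its unitary invariance:

```python
import math

def kmers(seq, k):
    """All overlapping substrings of length k -- the genomic analogue of words."""
    return [seq[i:i+k] for i in range(len(seq) - k + 1)]

def pip_matrix(emb):
    """Pairwise inner products E @ E.T of a vocabulary-by-dimension matrix."""
    return [[sum(a * b for a, b in zip(u, v)) for v in emb] for u in emb]

def pip_loss(e1, e2):
    """Frobenius norm of the difference of two PIP matrices.

    Because a rotation (unitary transform) of an embedding leaves its PIP
    matrix unchanged, this loss can compare embeddings that differ in
    orientation or dimensionality.
    """
    p1, p2 = pip_matrix(e1), pip_matrix(e2)
    return math.sqrt(sum((a - b)**2
                         for r1, r2 in zip(p1, p2)
                         for a, b in zip(r1, r2)))

print(kmers("GATTACA", 3))                  # ['GAT', 'ATT', 'TTA', 'TAC', 'ACA']
e = [[1.0, 0.0], [0.0, 2.0], [1.0, 1.0]]    # toy 3-"word", 2-d embedding
rot = [[-y, x] for x, y in e]               # 90-degree rotation of every vector
print(pip_loss(e, rot))                     # 0.0: PIP is unitary-invariant
```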
Procedia PDF Downloads 1383674 Properties of Sustainable Artificial Lightweight Aggregate
Authors: Wasan Ismail Khalil, Hisham Khalid Ahmed, Zainab Ali
Abstract:
Structural Lightweight Aggregate Concrete (SLWAC) has been developed in recent years because it reduces the dead load, cost, thermal conductivity, and coefficient of thermal expansion of the structure, so SLWAC has the advantage of being a relatively green building material. Lightweight Aggregate (LWA) either occurs as a natural material, such as pumice or scoria, or is produced artificially from raw materials such as expanded shale, clay, or slate. The use of SLWAC in Iraq is limited due to the lack of natural LWA. The existence of Iraqi clay deposits with different types and characteristics led to the idea of producing artificial expanded clay aggregate. The main aim of this work is to present the properties of an artificial LWA produced in the laboratory. Locally available bentonite clay, which occurs in the western region of Iraq, was used as the raw material to produce the LWA. Sodium silicate, a liquid industrial waste material from a glass plant, was mixed with the bentonite clay in a mix proportion of 1:1 by weight. The manufacturing method of the lightweight aggregate includes preparation and mixing of the clay and sodium silicate, burning of the mixture in a furnace at a temperature between 750-800˚C for two hours, and finally a gradual cooling process. The produced LWA was then crushed into small pieces, screened on a standard sieve series, and prepared with a grading that conforms to the specifications for LWA. The maximum aggregate size used in this investigation is 10 mm. The chemical composition and the physical properties of the produced LWA were investigated. The results indicate that the specific gravity of the produced LWA is 1.5, with a density of 543 kg/m3 and water absorption of 20.7%, which is in conformity with the international standard for LWA. Many trial mixes were carried out in order to produce LWAC containing the artificial LWA produced in this research.
The selected mix proportion is 1:1.5:2 (cement: sand: aggregate) by weight, with a water to cement ratio of 0.45. The experimental results show that the LWAC has an oven-dry density of 1720 kg/m3, water absorption of 8.5%, thermal conductivity of 0.723 W/m.K, and compressive strength of 23 N/mm2. The SLWAC produced in this research can be used in the construction of thermally insulated buildings and masonry units. It can be concluded that the SLWAC produced in this study contributes to sustainable development by using industrial waste materials, conserving energy, and enhancing the thermal and structural efficiency of concrete. Keywords: expanded clay, lightweight aggregate, structural lightweight aggregate concrete, sustainable
Procedia PDF Downloads 3283673 Documents Emotions Classification Model Based on TF-IDF Weighting Measure
Authors: Amr Mansour Mohsen, Hesham Ahmed Hassan, Amira M. Idrees
Abstract:
Emotion classification of text documents is applied to reveal whether a document expresses a particular emotion of its writer. While different supervised methods have previously been used for emotion classification of documents, in this research we present a novel model that supports the classification algorithms to give more accurate results through the TF-IDF measure. Different experiments have been applied to demonstrate the applicability of the proposed model. The model succeeds in raising the accuracy percentage on the chosen metrics (precision, recall, and f-measure) by refining the lexicon, integrating lexicons from different perspectives, and applying the TF-IDF weighting measure to the classifying features. The proposed model has also been compared with other research to prove its competence in raising the results' accuracy. Keywords: emotion detection, TF-IDF, WEKA tool, classification algorithms
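As a minimal illustration of the TF-IDF weighting the model applies to its classifying features (toy documents and tokenisation, not the paper's data):

```python
import math
from collections import Counter

def tfidf(docs):
    """TF-IDF weights for a list of tokenised documents (lists of words).

    TF is the relative term frequency within a document; IDF is
    log(N / document frequency), so terms concentrated in few
    documents get higher weight.
    """
    n = len(docs)
    df = Counter(w for doc in docs for w in set(doc))
    idf = {w: math.log(n / df[w]) for w in df}
    weights = []
    for doc in docs:
        tf = Counter(doc)
        weights.append({w: (tf[w] / len(doc)) * idf[w] for w in tf})
    return weights

docs = [
    "i am so happy and joyful today".split(),
    "this is sad terrible news".split(),
    "happy news today".split(),
]
w = tfidf(docs)
# "sad" occurs in only one document, so it outweighs the common "today".
print(w[1]["sad"], w[2]["today"])
```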
Procedia PDF Downloads 4843672 Electroencephalogram Based Alzheimer Disease Classification using Machine and Deep Learning Methods
Authors: Carlos Roncero-Parra, Alfonso Parreño-Torres, Jorge Mateo Sotos, Alejandro L. Borja
Abstract:
In this research, different methods based on machine/deep learning algorithms are presented for the classification and diagnosis of patients with mental disorders such as Alzheimer's disease. For this purpose, the signals obtained non-invasively from 32 unipolar EEG electrodes were examined, and their basic properties were obtained. More specifically, different well-known machine learning based classifiers have been used, i.e., support vector machine (SVM), Bayesian linear discriminant analysis (BLDA), decision tree (DT), Gaussian Naïve Bayes (GNB), K-nearest neighbor (KNN), and Convolutional Neural Network (CNN). A total of 668 patients from five different hospitals were studied in the period from 2011 to 2021. The best accuracy obtained was around 93% in both ADM and ADA classifications. It can be concluded that such a classification will enable the training of algorithms that can be used to identify and classify different mental disorders with high accuracy. Keywords: alzheimer, machine learning, deep learning, EEG
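Of the classifiers listed, KNN is the simplest to show end to end. A sketch with made-up two-dimensional EEG-style features (the feature names, values, and labels are purely illustrative, not derived from the study's data):

```python
import math
from collections import Counter

def knn_predict(train, query, k=3):
    """k-nearest-neighbour majority vote.

    `train` is a list of (feature_vector, label) pairs; the query is
    labelled by the most common label among its k closest vectors.
    """
    dist = lambda u, v: math.sqrt(sum((a - b)**2 for a, b in zip(u, v)))
    nearest = sorted(train, key=lambda item: dist(item[0], query))[:k]
    return Counter(label for _, label in nearest).most_common(1)[0][0]

# Hypothetical features: (mean alpha-band power, spectral entropy).
train = [((0.8, 0.30), "control"), ((0.9, 0.28), "control"), ((0.7, 0.35), "control"),
         ((0.4, 0.60), "alzheimer"), ((0.3, 0.65), "alzheimer"), ((0.5, 0.55), "alzheimer")]
print(knn_predict(train, (0.45, 0.58)))   # "alzheimer"
```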
Procedia PDF Downloads 1263671 Vibroacoustic Modulation with Chirp Signal
Authors: Dong Liu
Abstract:
By sending a high-frequency probe wave and a low-frequency pump wave into a specimen, the vibroacoustic modulation (VAM) method evaluates a defect's severity according to the modulation index of the received signal. Many studies have experimentally demonstrated the significant sensitivity of the modulation index to tiny contact-type defects. However, it has also been found that the modulation index is highly affected by the frequencies of the probe and pump waves. Therefore, the chirp signal has been introduced into the VAM method, since it can assess multiple frequencies in a relatively short time and so enhance the robustness of the method. Consequently, the signal processing method needs to be modified accordingly. Various studies have utilized different algorithms, or combinations of algorithms, for processing VAM signals under chirp excitation. These signal processing methods were compared and used to process VAM signals acquired from steel samples. Keywords: vibroacoustic modulation, nonlinear acoustic modulation, nonlinear acoustic NDT&E, signal processing, structural health monitoring
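The modulation index itself can be computed from sideband amplitudes around the probe frequency. The sketch below builds a synthetic amplitude-modulated received signal (illustrative frequencies and modulation depth, not measured data) and recovers its modulation index with single-bin DFTs:

```python
import math

def tone_amplitude(signal, fs, f):
    """Single-bin DFT amplitude of `signal` (sampled at fs) at frequency f."""
    n = len(signal)
    re = sum(x * math.cos(2 * math.pi * f * i / fs) for i, x in enumerate(signal))
    im = sum(x * math.sin(2 * math.pi * f * i / fs) for i, x in enumerate(signal))
    return 2.0 * math.sqrt(re * re + im * im) / n

def modulation_index(signal, fs, f_probe, f_pump):
    """MI = (A_left + A_right) / A_carrier for the first sideband pair."""
    a_c = tone_amplitude(signal, fs, f_probe)
    a_l = tone_amplitude(signal, fs, f_probe - f_pump)
    a_r = tone_amplitude(signal, fs, f_probe + f_pump)
    return (a_l + a_r) / a_c

# Synthetic received signal: probe amplitude-modulated by the pump, which is
# the signature a breathing-crack type defect produces.
fs, fp, fm, depth = 50_000, 5_000, 200, 0.1
x = [(1 + depth * math.cos(2 * math.pi * fm * i / fs))
     * math.cos(2 * math.pi * fp * i / fs) for i in range(fs)]  # one second
mi = modulation_index(x, fs, fp, fm)
print(mi)   # close to the modulation depth for a pure AM signal
```

Under chirp excitation the same sideband measurement is repeated across the swept probe frequencies, which is what the compared processing algorithms differ on.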
Procedia PDF Downloads 993670 A Bio-Inspired Approach for Self-Managing Wireless Sensor and Actor Networks
Authors: Lyamine Guezouli, Kamel Barka, Zineb Seghir
Abstract:
Wireless sensor and actor networks (WSANs) present a research challenge for different practice areas. Researchers are trying to optimize the use of such networks through their research work. This optimization is done on certain criteria, such as improving energy efficiency, exploiting node heterogeneity, self-adaptability and self-configuration. In this article, we present our proposal for BIFSA (Biologically-Inspired Framework for Wireless Sensor and Actor networks). Indeed, BIFSA is a middleware that addresses the key issues of wireless sensor and actor networks. BIFSA consists of two types of agents: sensor agents (SA) that operate at the sensor level to collect and transport data to actors and actor agents (AA) that operate at the actor level to transport data to base stations. Once the sensor agent arrives at the actor, it becomes an actor agent, which can exploit the resources of the actors and vice versa. BIFSA allows agents to evolve their genetic structures and adapt to the current network conditions. The simulation results show that BIFSA allows the agents to make better use of all the resources available in each type of node, which improves the performance of the network.Keywords: wireless sensor and actor networks, self-management, genetic algorithm, agent.
Procedia PDF Downloads 893669 Combining Mobile Intelligence with Formation Mechanism for Group Commerce
Authors: Lien Fa Lin, Yung Ming Li, Hsin Chen Hsieh
Abstract:
The rise of smartphones has brought the new concept So-Lo-Mo (social-local-mobile) to the mobile commerce area in recent years. However, current So-Lo-Mo services focus only on individual users rather than groups of users, and the development of group commerce is not sufficient to satisfy the demand for real-time group buying, with little thought given to the social relationships between customers. In this research, we integrate mobile intelligence with group commerce and consider customers' preferences, real-time context, and social influence as components of the mechanism. With the support of this mechanism, customers who want to purchase products or services are able to gather nearby customers with the same potential purchase willingness through their mobile devices and form a real-time group buy. By matching the demand and supply of the mobile group-buying market, this research further improves the business value of mobile commerce and group commerce. Keywords: group formation, group commerce, mobile commerce, So-Lo-Mo, social influence
Procedia PDF Downloads 4153668 Genetically Engineered Crops: Solution for Biotic and Abiotic Stresses in Crop Production
Authors: Deepak Loura
Abstract:
Production and productivity of several crops in the country continue to be adversely affected by biotic stresses (e.g., insect pests and diseases) and abiotic stresses (e.g., water, temperature, and salinity). Over-dependence on pesticides and other chemicals is economically non-viable for the resource-poor farmers of our country. Further, pesticides can potentially affect human and environmental safety. While traditional breeding techniques and proper management strategies continue to play a vital role in crop improvement, we need to judiciously use biotechnology approaches for the development of genetically modified crops addressing critical problems in the improvement of crop plants for sustainable agriculture. Modern biotechnology can help to increase crop production, reduce farming costs, and improve food quality and the safety of the environment. Genetic engineering is a new technology that allows plant breeders to produce plants with new gene combinations through genetic transformation of crop plants for the improvement of agronomic traits. Advances in recombinant DNA technology have made it possible to transfer genes between widely divergent species to develop genetically modified or genetically engineered plants. Plant genetic engineering provides the means to harness useful genes and alleles from indigenous microorganisms to enrich the gene pool for developing genetically modified (GM) crops that have inbuilt (inherent) resistance to insect pests, diseases, and abiotic stresses. Plant biotechnology has made significant contributions in the past 20 years in the development of genetically engineered or genetically modified crops with multiple benefits. A variety of traits have been introduced into genetically engineered crops, including (i) herbicide resistance, (ii) pest resistance, (iii) viral resistance, (iv) slow ripening of fruits and vegetables, (v) fungal and bacterial resistance, (vi) abiotic stress tolerance (drought, salinity, temperature, flooding, etc.),
(vii) quality improvement (starch, protein, and oil), (viii) value addition (vitamins, micro- and macro-elements), (ix) pharmaceutical and therapeutic proteins, and (x) edible vaccines, etc. Multiple genes in transgenic crops can be useful in developing durable disease resistance and a broad insect-control spectrum, and could lead to potential cost-saving advantages for farmers. The development of transgenics to produce high-value pharmaceuticals and edible vaccines is also in progress, though much more research and development work is required before commercially viable products become available. In addition, marker-assisted selection (MAS) is now routinely used to enhance the speed and precision of plant breeding. Newer technologies need to be developed and deployed for enhancing and sustaining agricultural productivity. There is a need to optimize the use of biotechnology in conjunction with conventional technologies to achieve higher productivity with fewer resources. Therefore, genetic modification/engineering of crop plants assumes greater importance, which demands the development and adoption of newer technology for the genetic improvement of crops for increasing crop productivity. Keywords: biotechnology, plant genetic engineering, genetically modified, biotic, abiotic, disease resistance
Procedia PDF Downloads 713667 Relating Symptoms with Protein Production Abnormality in Patients with Down Syndrome
Authors: Ruolan Zhou
Abstract:
Trisomy of human chromosome 21 is the primary cause of Down Syndrome (DS), a genetic disease that has significantly burdened families and countries and caused great controversy. To address this problem, this research explores the relationship between the genetic abnormality and the disease's symptoms, adopting several techniques including data analysis and enrichment analysis. It also draws on open-source databases, such as NCBI, DAVID, SOURCE, STRING, and UCSC, to complement its results. This research analyzed the genes on human chromosome 21 and, through this analysis, specified the protein-coding genes, their functions, and their locations. Using enrichment analysis, this paper found an abundance of keratin production-related protein-coding genes on human chromosome 21. Drawing on past research, this study attempts to disclose the relationship between trisomy of human chromosome 21 and keratin production abnormality, which might explain diseases common in patients with Down Syndrome. Finally, by addressing the advantages and insufficiencies of this research, the discussion provides specific directions for future research. Keywords: Down Syndrome, protein production, genome, enrichment analysis
Procedia PDF Downloads 1263666 Gray Level Image Encryption
Authors: Roza Afarin, Saeed Mozaffari
Abstract:
The aim of this paper is image encryption using a Genetic Algorithm (GA). The proposed encryption method consists of two phases. In the modification phase, pixel locations are altered to reduce the correlation among adjacent pixels. Then, pixel values are changed in the diffusion phase to encrypt the input image. Both phases are performed by a GA with binary chromosomes. For the modification phase, these binary patterns are generated by the Local Binary Pattern (LBP) operator, while for the diffusion phase binary chromosomes are obtained by Bit Plane Slicing (BPS). The initial population in the GA includes the rows and columns of the input image. Instead of subjective selection of parents from this initial population, a random generator with a predefined key is utilized, which makes it possible to decrypt the coded image and reconstruct the initial input image. The fitness function is defined as the average of the 0-to-1 transitions in the LBP image for the modification phase and as histogram uniformity for the diffusion phase. The randomness of the encrypted image is measured by entropy, correlation coefficients, and histogram analysis. Experimental results show that the proposed method is fast enough and can be used effectively for image encryption. Keywords: correlation coefficients, genetic algorithm, image encryption, image entropy
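The two chromosome generators named above are both small, concrete operators. A sketch of Bit Plane Slicing and an 8-neighbour LBP code on a toy 3x3 image (the pixel values are arbitrary, and the surrounding GA is omitted):

```python
def bit_planes(image):
    """Slice a grayscale image (rows of 0-255 ints) into 8 binary planes.

    planes[b][r][c] is bit b of the pixel at (r, c); each plane is a
    ready-made binary chromosome for the diffusion phase.
    """
    return [[[(px >> b) & 1 for px in row] for row in image] for b in range(8)]

def lbp(image, r, c):
    """8-neighbour Local Binary Pattern code of the interior pixel at (r, c)."""
    centre = image[r][c]
    neigh = [image[r-1][c-1], image[r-1][c], image[r-1][c+1], image[r][c+1],
             image[r+1][c+1], image[r+1][c], image[r+1][c-1], image[r][c-1]]
    return sum(1 << i for i, p in enumerate(neigh) if p >= centre)

img = [[ 10,  20,  30],
       [ 40, 128, 200],
       [250,  60,  70]]
planes = bit_planes(img)
code = lbp(img, 1, 1)
print(code)   # 72: only the 200 and 250 neighbours reach the centre value 128
```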
Procedia PDF Downloads 3303665 Using Machine Learning as an Alternative for Predicting Exchange Rates
Authors: Pedro Paulo Galindo Francisco, Eli Dhadad Junior
Abstract:
This study addresses the Meese-Rogoff puzzle by introducing the latest machine learning techniques as alternatives for predicting exchange rates. Using RMSE as a comparison metric, Meese and Rogoff discovered that economic models are unable to outperform the random walk model as short-term exchange rate predictors. Decades after this study, no statistical prediction technique has proven effective in overcoming this obstacle; although there have been positive results, they did not apply to all currencies and periods examined. Recent advancements in artificial intelligence technologies have paved the way for a new approach to exchange rate prediction. Leveraging this technology, we applied five machine learning techniques in an attempt to overcome the Meese-Rogoff puzzle. We considered daily data for the Brazilian real, Japanese yen, British pound, euro, and Chinese yuan against the US dollar over a time horizon from 2010 to 2023. Our results showed that none of the presented techniques was able to produce an RMSE lower than the random walk model. However, some models, particularly LSTM and N-BEATS, were able to outperform the ARIMA model. The results also suggest that machine learning models have untapped potential and could represent an effective long-term possibility for overcoming the Meese-Rogoff puzzle. Keywords: exchange rate, prediction, machine learning, deep learning
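The benchmark at the heart of the puzzle is simple to reproduce on synthetic data: the random walk forecast says tomorrow's rate equals today's. The sketch below (simulated rates, not the study's data) compares it against a naive trend-extrapolation competitor by RMSE:

```python
import math
import random

def rmse(pred, actual):
    """Root mean squared error between two equal-length series."""
    return math.sqrt(sum((p - a)**2 for p, a in zip(pred, actual)) / len(pred))

def random_walk_forecast(series):
    """One-step-ahead random-walk forecast: tomorrow's rate = today's rate."""
    return series[:-1]

random.seed(1)
# Synthetic exchange-rate path (multiplicative random walk, illustrative only).
rate = [1.0]
for _ in range(999):
    rate.append(rate[-1] * (1 + random.gauss(0, 0.005)))

actual = rate[1:]
rw_rmse = rmse(random_walk_forecast(rate), actual)

# A naive trend-extrapolation competitor: repeat the last observed move.
trend = [2 * rate[i] - rate[i-1] for i in range(1, len(rate) - 1)]
trend_rmse = rmse(trend, actual[1:])
print(rw_rmse, trend_rmse)   # on a true random walk the trend model does worse
```

This is exactly why the puzzle is hard: when the data-generating process is close to a random walk, any model that extrapolates structure pays an RMSE penalty.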
Procedia PDF Downloads 323664 Phylogeographic Reconstruction of the Tiger Shrimp (Penaeus monodon) Invasion in the Atlantic Ocean: The Role of the Farming Systems in the Marine Biological Invasions
Authors: Juan Carlos Aguirre Pabon, Stephen Sabatino, James Morris, Khor Waiho, Antonio Murias
Abstract:
The tiger shrimp Penaeus monodon is one of the most important species in aquaculture and is native to the Indo-Pacific Ocean. During its period of greatest success in world production (the 1970s and 1980s), it was introduced into many Atlantic Ocean countries for cultivation purposes and is currently reported as established in several countries of this area. Because there are no studies examining the magnitude of the invasion process, this is an exciting opportunity to test evolutionary hypotheses in the context of marine invasions mediated by culture systems; therefore, the purpose of this study was to reconstruct the invasion scenario of P. monodon in the Atlantic Ocean using mitochondrial DNA and eight microsatellite loci. Samples were collected from the invaded area in the Atlantic Ocean (US, Colombia, Venezuela, Brazil, Guinea-Bissau, Senegal), the Indo-Pacific Ocean (Indonesia, India, Mozambique), and some cultivation systems (India, Bangladesh, Madagascar), and analyses of phylogenetic relationships (using several species of the family), genetic diversity, population structure, and demographic changes were performed. High intraspecific divergence in P. semisulcatus and P. monodon was found, along with high genetic variability at all sites (especially with microsatellites) and the presence of three clusters or populations. In addition, signs of demographic expansion in the cultured population and bottlenecks in the invasive and native populations were found, as well as evidence of gene mixture among all of the populations studied, implying that culture systems play an essential role in mitigating the negative effects of the founder effect and providing a source of genetic variability that can ensure the success of the invasion. Keywords: species introduction, increased variability, demographic changes, promoting invasion
Procedia PDF Downloads 513663 Modelling Mode Choice Behaviour Using Cloud Theory
Authors: Leah Wright, Trevor Townsend
Abstract:
Mode choice models are crucial instruments in the analysis of travel behaviour. These models show the relationship between an individual's choice of transportation mode for a given O-D pair and the individual's socioeconomic characteristics, such as household size, income level, age and/or gender, and the features of the transportation system. The most popular functional forms of these models are based on Utility-Based Choice Theory, which addresses the uncertainty in the decision-making process with the use of an error term. However, with the development of artificial intelligence, many researchers have started to take a different approach to travel demand modelling. In recent times, researchers have looked at using neural networks, fuzzy logic, and rough set theory to develop improved mode choice formulas. The concept of cloud theory has recently been introduced to model decision-making under uncertainty. Unlike the previously mentioned theories, cloud theory recognises a relationship between randomness and fuzziness, two of the most common types of uncertainty. This research aims to investigate the use of cloud theory in mode choice models, and this paper highlights the conceptual framework of a mode choice model using cloud theory. Merging decision-making under uncertainty with mode choice models is state of the art. The cloud theory model is expected to address the issues and concerns with the nested logit model and improve the design of mode choice models and their use in travel demand analysis. Keywords: cloud theory, decision-making, mode choice models, travel behaviour, uncertainty
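A core building block of cloud theory is the forward normal cloud generator, which turns three numerical characteristics (expectation Ex, entropy En, hyper-entropy He) into "cloud drops" with membership degrees, capturing both fuzziness and randomness at once. A sketch with illustrative travel-time parameters (not taken from the paper):

```python
import math
import random

def normal_cloud(ex, en, he, n):
    """Forward normal cloud generator: n cloud drops (x, membership).

    ex: expectation, en: entropy (fuzziness), he: hyper-entropy (the
    randomness of the fuzziness itself). Each drop samples a randomised
    entropy En' ~ N(en, he^2), then x ~ N(ex, En'^2), and assigns the
    membership degree exp(-(x - ex)^2 / (2 En'^2)).
    """
    drops = []
    for _ in range(n):
        en_i = random.gauss(en, he)
        x = random.gauss(ex, abs(en_i))
        mu = math.exp(-(x - ex)**2 / (2 * en_i**2)) if en_i != 0 else 1.0
        drops.append((x, mu))
    return drops

random.seed(42)
# Illustrative linguistic concept: "acceptable travel time" of about 30 minutes.
drops = normal_cloud(ex=30.0, en=5.0, he=0.5, n=2000)
mean_x = sum(x for x, _ in drops) / len(drops)
print(mean_x)   # concentrates around Ex = 30
```

In a mode choice setting, each fuzzy attribute (travel time, cost, comfort) would get its own (Ex, En, He) triple, and memberships would feed the choice rule in place of a deterministic utility term.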
Procedia PDF Downloads 3883662 EEG-Based Screening Tool for School Student’s Brain Disorders Using Machine Learning Algorithms
Authors: Abdelrahman A. Ramzy, Bassel S. Abdallah, Mohamed E. Bahgat, Sarah M. Abdelkader, Sherif H. ElGohary
Abstract:
Attention-Deficit/Hyperactivity Disorder (ADHD), epilepsy, and autism affect millions of children worldwide, many of whom are undiagnosed even though all of these disorders are detectable in early childhood. Late diagnosis can cause severe problems due to delayed treatment and to misconceptions and a general lack of awareness of these disorders. Moreover, electroencephalography (EEG) has played a vital role in the assessment of neural function in children. Therefore, quantitative EEG measurement will be utilized as a tool in the evaluation of patients who may have ADHD, epilepsy, or autism. We propose a screening tool that uses EEG signals and machine learning algorithms to detect these disorders at an early age in an automated manner. As a first step, the proposed classifiers were applied to epilepsy and provided an accuracy of approximately 97% using SVM, Naïve Bayes, and decision tree, and 98% using KNN, which gives hope for the work yet to be conducted. Keywords: ADHD, autism, epilepsy, EEG, SVM
Procedia PDF Downloads 1903661 Using Computer Vision to Detect and Localize Fractures in Wrist X-ray Images
Authors: John Paul Q. Tomas, Mark Wilson L. de los Reyes, Kirsten Joyce P. Vasquez
Abstract:
The most frequent type of fracture is a wrist fracture, which is often difficult for medical professionals to detect and localize. In this study, fractures in wrist x-ray images were located and identified using deep learning and computer vision. The researchers used image filtering, masking, morphological operations, and data augmentation for image preprocessing, and trained the RetinaNet and Faster R-CNN models with ResNet50 backbones and Adam optimizers separately for each image filtering technique and projection. The RetinaNet model with the Anisotropic Diffusion Smoothing filter trained for 50 epochs obtained the greatest accuracy of 99.14%, precision of 100%, sensitivity/recall of 98.41%, specificity of 100%, and an IoU score of 56.44% for the posteroanterior projection utilizing augmented data. For the lateral projection using augmented data, the RetinaNet model with an Anisotropic Diffusion filter trained for 50 epochs produced the highest accuracy of 98.40%, precision of 98.36%, sensitivity/recall of 98.36%, specificity of 98.43%, and an IoU score of 58.69%. When comparing the test results of the different individual projections, models, and image filtering techniques, the Anisotropic Diffusion filter trained for 50 epochs produced the best classification and regression scores for both projections. Keywords: artificial intelligence, computer vision, wrist fracture, deep learning
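The IoU score reported above has a short closed form for axis-aligned boxes. A sketch in (x1, y1, x2, y2) form, with made-up pixel coordinates standing in for a ground-truth fracture box and a detector prediction:

```python
def iou(box_a, box_b):
    """Intersection-over-Union of two axis-aligned boxes (x1, y1, x2, y2)."""
    ix1, iy1 = max(box_a[0], box_b[0]), max(box_a[1], box_b[1])
    ix2, iy2 = min(box_a[2], box_b[2]), min(box_a[3], box_b[3])
    iw, ih = max(0, ix2 - ix1), max(0, iy2 - iy1)   # clamp: no overlap -> 0
    inter = iw * ih
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    return inter / (area_a + area_b - inter) if inter else 0.0

# Hypothetical ground-truth fracture box vs. a detector prediction (pixels).
gt   = (100, 100, 200, 200)
pred = (150, 150, 250, 250)
print(iou(gt, pred))   # 2500 / (10000 + 10000 - 2500) = 0.142857...
```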
Procedia PDF Downloads 733660 Hybrid Hierarchical Clustering Approach for Community Detection in Social Network
Authors: Radhia Toujani, Jalel Akaichi
Abstract:
Social networks generally present a hierarchy of communities. To determine these communities and the relationships between them, detection algorithms should be applied. Most of the existing algorithms proposed for hierarchical community identification are based on either agglomerative clustering or divisive clustering. In this paper, we present a hybrid hierarchical clustering approach for community detection based on both bottom-up (agglomerative) and top-down (divisive) clustering. Our approach provides a more relevant community structure than hierarchical methods that consider only divisive or agglomerative clustering to identify communities. Moreover, we performed comparative experiments to enhance the quality of the clustering results and to show the effectiveness of our algorithm. Keywords: agglomerative hierarchical clustering, community structure, divisive hierarchical clustering, hybrid hierarchical clustering, opinion mining, social network, social network analysis
Procedia PDF Downloads 3653659 Intelligent Chatbot Generating Dynamic Responses Through Natural Language Processing
Authors: Aarnav Singh, Jatin Moolchandani
Abstract:
The proposed research work aims to build a query-based AI chatbot that can answer questions on a variety of topics. A chatbot is software that converses with users via text messages. In the proposed system, the chatbot generates a response based on the user's query: natural language processing is used to comprehend the query, and a concise answer is formed by examining a set of texts. The texts are obtained through web scraping, filtering the credible sources returned by a web search. The objective of this project is to provide users with a chatbot that gives simple and accurate answers without the user having to read through a large number of articles and websites. In addition to exploring the reasons for the broad acceptance of chatbots and their usefulness across many industries, this article offers an overview of the interest in chatbots throughout the world. Keywords: chatbot, artificial intelligence, natural language processing, web scraping
Procedia PDF Downloads 66
3658 Data-Driven Monitoring and Control of Water Sanitation and Hygiene for Improved Maternal Health in Rural Communities
Authors: Paul Barasa Wanyama, Tom Wanyama
Abstract:
Governments and development partners in low-income countries often prioritize building water, sanitation, and hygiene (WaSH) infrastructure at healthcare facilities to improve maternal healthcare outcomes. However, the operation, maintenance, and utilization of this infrastructure are almost never considered. Many healthcare facilities in these countries use untreated water that is not monitored for quality or quantity. Consequently, it is common to run out of water while a patient is on the way to, or in, the operating theater. Further, handwashing stations in healthcare facilities regularly run out of water or soap for months, and the latrines are typically not clean, partly due to the lack of water. In this paper, we present a system that uses the Internet of Things (IoT), big data, cloud computing, and AI to establish WaSH security in healthcare facilities, with a specific focus on maternal health. We have implemented smart sensors and actuators to monitor and control WaSH systems remotely to ensure their objectives are achieved. We have also developed a cloud-based system that analyzes WaSH data in real time and communicates relevant information back to the healthcare facilities and their stakeholders (e.g., medical personnel, NGOs, ministry of health officials, facilities managers, community leaders, pregnant women, and new mothers and their families) to avert or mitigate problems before they occur.
Keywords: WaSH, internet of things, artificial intelligence, maternal health, rural communities, healthcare facilities
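One way the real-time analysis step might flag problems can be sketched as a simple range check over sensor readings. The threshold values and sensor names below are invented for illustration; the paper does not specify its actual thresholds or cloud pipeline:

```python
# Hypothetical safe ranges; real thresholds would come from WHO
# guidance and the facility's own operational requirements.
SAFE_RANGES = {
    "turbidity_ntu": (0.0, 5.0),
    "free_chlorine_mg_l": (0.2, 0.5),
    "tank_level_pct": (20.0, 100.0),
}

def check_reading(reading):
    """Compare one sensor reading against safe ranges and return
    a list of alert strings for any out-of-range values."""
    alerts = []
    for key, value in reading.items():
        low, high = SAFE_RANGES[key]
        if not (low <= value <= high):
            alerts.append(f"{key}={value} outside [{low}, {high}]")
    return alerts

reading = {"turbidity_ntu": 8.3, "free_chlorine_mg_l": 0.3, "tank_level_pct": 12.0}
for alert in check_reading(reading):
    print(alert)
```

In the described system these alerts would be pushed from the cloud back to facility staff and other stakeholders before the water actually runs out.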
Procedia PDF Downloads 21
3657 Internet of Things: Route Search Optimization Applying Ant Colony Algorithm and Theory of Computer Science
Authors: Tushar Bhardwaj
Abstract:
The Internet of Things (IoT) forms a dynamic network in which nodes (mobile devices) are added and removed constantly and randomly; hence, the traffic distribution in the network is highly variable and irregular. A basic but very important task in any network is route searching. Conventional route-searching algorithms, such as link-state and distance-vector algorithms, are restricted to static point-to-point network topologies. In this paper, we propose a model that uses the ant colony algorithm for route searching. It is dynamic in nature and has a positive feedback mechanism well suited to route searching. We have also embedded non-deterministic finite automaton (NDFA) minimization to reduce the network size and improve performance. Results show that the ant colony algorithm finds the shortest path from the source to the destination node, and that NDFA minimization effectively reduces the broadcast storm.
Keywords: routing, ant colony algorithm, NDFA, IoT
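A toy version of ant-colony route search on a small graph illustrates the positive-feedback mechanism: ants probabilistically prefer low-cost, high-pheromone edges, and pheromone is reinforced on cheap completed paths. All parameters and the example graph are invented; the paper's model, and its NDFA minimization step, are not reproduced here:

```python
import random

def aco_shortest_path(graph, source, dest, n_ants=20, n_iters=50,
                      evaporation=0.5, seed=0):
    """Toy ant colony optimisation for a shortest path.
    graph: {node: {neighbour: edge_cost}}."""
    rng = random.Random(seed)
    pheromone = {(u, v): 1.0 for u in graph for v in graph[u]}
    best_path, best_cost = None, float("inf")
    for _ in range(n_iters):
        paths = []
        for _ in range(n_ants):
            node, path, visited = source, [source], {source}
            while node != dest:
                choices = [v for v in graph[node] if v not in visited]
                if not choices:          # dead end: discard this ant
                    path = None
                    break
                # prefer edges with more pheromone and lower cost
                weights = [pheromone[(node, v)] / graph[node][v] for v in choices]
                node = rng.choices(choices, weights=weights)[0]
                path.append(node)
                visited.add(node)
            if path:
                cost = sum(graph[u][v] for u, v in zip(path, path[1:]))
                paths.append((path, cost))
                if cost < best_cost:
                    best_path, best_cost = path, cost
        # evaporate, then deposit pheromone inversely to path cost
        for edge in pheromone:
            pheromone[edge] *= (1 - evaporation)
        for path, cost in paths:
            for edge in zip(path, path[1:]):
                pheromone[edge] += 1.0 / cost
    return best_path, best_cost

graph = {
    "A": {"B": 1, "C": 4},
    "B": {"C": 1, "D": 5},
    "C": {"D": 1},
    "D": {},
}
print(aco_shortest_path(graph, "A", "D"))
```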
Procedia PDF Downloads 444
3656 The Assessment of Bilingual Students: How Bilingual Can It Really Be?
Authors: Serge Lacroix
Abstract:
The proposed study looks at the psychoeducational assessment of bilingual students, in English and French in this case. It is an opportunity to examine the language of assessment, and specifically how certain tests can be administered in one language and others in another. It also questions the validity of the test scores obtained, as well as the quality and generalizability of the conclusions that can be drawn. Bilingualism and multiculturalism, although in constant expansion, are not considered in norms development and remain poorly understood factors when they are at play in a psychoeducational assessment. Student placement, diagnoses, and accurate measures of intelligence and achievement are all affected by the quality of the assessment procedure. The same is true for questionnaires administered to parents and self-reports completed by bilingual students who, more often than not, are assessed in a language that is not their primary one, or are compared to monolinguals not facing the same challenges or possessing the same skills. Results show that when offered the opportunity to work bilingually, students choose to do so in a significant proportion. Recommendations are offered to support educators seeking to expand their skills when working with multilingual students in an assessment context.
Keywords: psychoeducational assessment, bilingualism, multiculturalism, intelligence, achievement
Procedia PDF Downloads 455
3655 Optimisation of Intermodal Transport Chain of Supermarkets on Isle of Wight, UK
Authors: Jingya Liu, Yue Wu, Jiabin Luo
Abstract:
This work investigates an intermodal transportation system for delivering goods from a regional distribution centre to supermarkets on the Isle of Wight (IOW) via the port of Southampton or Portsmouth in the UK. We consider this integrated logistics chain as a three-echelon transportation system. In such a system, two types of transport are used to deliver goods across the Solent Channel: accompanied transport, used by most supermarkets on the IOW, such as Spar, Lidl and Co-operative Food; and unaccompanied transport, used by Aldi. Five transport scenarios are studied, based on different transport modes and ferry routes. The aim is to determine an optimal delivery plan for supermarkets of different business scales on the IOW, in order to minimise the total running cost, fuel consumption and carbon emissions. The problem is modelled as a vehicle routing problem with time windows and solved by a genetic algorithm. The computational results suggest that accompanied transport is more cost-efficient for small and medium business-scale supermarket chains on the IOW, while unaccompanied transport has the potential to improve the efficiency and effectiveness of large business-scale supermarket chains.
Keywords: genetic algorithm, intermodal transport system, Isle of Wight, optimization, supermarket
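A compact genetic-algorithm sketch for ordering delivery stops conveys the core of such an approach; the distance matrix is made up, and the study's actual model also handles time windows, ferry choice and multiple vehicles, all omitted here:

```python
import random

def route_cost(route, dist):
    """Total distance of visiting stops in order, starting and ending at depot 0."""
    stops = [0] + list(route) + [0]
    return sum(dist[a][b] for a, b in zip(stops, stops[1:]))

def genetic_route(dist, n_gen=200, pop_size=30, seed=1):
    """Tiny genetic algorithm for delivery-stop ordering:
    elitist selection, one-point order crossover, swap mutation."""
    rng = random.Random(seed)
    stops = list(range(1, len(dist)))
    pop = [rng.sample(stops, len(stops)) for _ in range(pop_size)]
    for _ in range(n_gen):
        pop.sort(key=lambda r: route_cost(r, dist))
        survivors = pop[: pop_size // 2]          # keep the fitter half
        children = []
        while len(survivors) + len(children) < pop_size:
            p1, p2 = rng.sample(survivors, 2)
            cut = rng.randrange(len(stops))
            # order crossover: prefix from p1, remaining stops in p2's order
            child = p1[:cut] + [s for s in p2 if s not in p1[:cut]]
            i, j = rng.sample(range(len(stops)), 2)  # swap mutation
            child[i], child[j] = child[j], child[i]
            children.append(child)
        pop = survivors + children
    best = min(pop, key=lambda r: route_cost(r, dist))
    return best, route_cost(best, dist)

# Symmetric distance matrix: depot plus four supermarkets (made-up values).
dist = [
    [0, 2, 9, 10, 7],
    [2, 0, 6, 4, 3],
    [9, 6, 0, 8, 5],
    [10, 4, 8, 0, 6],
    [7, 3, 5, 6, 0],
]
best, cost = genetic_route(dist)
print(best, cost)
```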
Procedia PDF Downloads 369
3654 Rheological Evaluation of Mucoadhesive Precursors of Poloxamer 407- or Polyethylenimine-Based Liquid Crystal Systems for Buccal Administration
Authors: Jéssica Bernegossi, Lívia Nordi Dovigo, Marlus Chorilli
Abstract:
Mucoadhesive liquid crystalline systems are emerging as delivery systems for the oral cavity. These systems are interesting since they facilitate the targeting of medicines and modify drug release, enabling a reduction in the number of applications made by the patient. The buccal mucosa is permeable, has a rich blood supply, and avoids first-pass metabolism, making it a good route of administration. Two liquid crystal systems were developed, using ethoxylated and propoxylated ethyl alcohol (30%) as surfactant, oleic acid (60%) as oil phase, and, as the aqueous phase (10%), a dispersion of either the polymer polyethylenimine (0.5%) or the polymer poloxamer 407 (16%), with the intention of application to the buccal mucosa. Initially, the systems were characterized by polarized light microscopy and rheological analysis. To prepare the systems, the components described above were added to glass vials and shaken. Then, 30% and 100% artificial saliva were added to each prepared formulation so as to simulate the environment of the oral cavity. To verify the system structure, aliquots of the formulations were placed on glass slides, covered with coverslips, and examined in a polarized light microscope (PLM; Zeiss® Axioskop) at 40× magnification. The rheological profiles were evaluated on a TA Instruments® rheometer, with rheograms obtained in flow mode at 37 ºC (98.6 ºF).
Under PLM, the formulations containing polyethylenimine or poloxamer 407 without added artificial saliva showed a dark field, indicative of a microemulsion; the same was observed for the formulation to which 30% artificial saliva was added. The formulation to which 100% artificial saliva was added proved to be a structured system, presenting anisotropy with striae, indicative of a hexagonal liquid crystalline mesophase. The rheograms showed that both systems without added artificial saliva had a Newtonian profile; after the addition of 30% artificial saliva, they exhibited non-Newtonian behavior of the pseudoplastic-thixotropic type, and after the addition of 100% artificial saliva, plastic-thixotropic behavior. Furthermore, the formulations containing poloxamer 407 showed significantly larger shear stress (15-800 Pa) than those containing polyethylenimine (5-50 Pa), indicating greater plasticity. Thus, the addition of saliva helped to structure the system, which progressed from a microemulsion to a liquid crystal system, thereby also changing its rheological behavior. The systems have promising characteristics as controlled-release systems for the oral cavity, as they show good fluidity during application and greater structuring on contact with saliva in the buccal environment.
Keywords: liquid crystal system, poloxamer 407, polyethylenimine, rheology
Procedia PDF Downloads 458
3653 AI-Driven Forecasting Models for Anticipating Oil Market Trends and Demand
Authors: Gaurav Kumar Sinha
Abstract:
The volatility of the oil market, influenced by geopolitical, economic, and environmental factors, presents significant challenges for stakeholders in predicting trends and demand. This article explores the application of artificial intelligence (AI) in developing robust forecasting models to anticipate changes in the oil market more accurately. We delve into various AI techniques, including machine learning, deep learning, and time series analysis, that have been adapted to analyze historical data and current market conditions to forecast future trends. The study evaluates the effectiveness of these models in capturing complex patterns and dependencies in market data, which traditional forecasting methods often miss. Additionally, the paper discusses the integration of external variables such as political events, economic policies, and technological advancements that influence oil prices and demand. By leveraging AI, stakeholders can achieve a more nuanced understanding of market dynamics, enabling better strategic planning and risk management. The article concludes with a discussion on the potential of AI-driven models in enhancing the predictive accuracy of oil market forecasts and their implications for global economic planning and strategic resource allocation.
Keywords: AI forecasting, oil market trends, machine learning, deep learning, time series analysis, predictive analytics, economic factors, geopolitical influence, technological advancements, strategic planning
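As a simple classical baseline of the kind such AI models are typically compared against (not one of the paper's models), simple exponential smoothing yields a one-step-ahead forecast; the demand figures below are invented for illustration:

```python
def ses_forecast(series, alpha=0.3):
    """Simple exponential smoothing: the level is updated as a weighted
    average of the latest observation and the previous level, and the
    final level serves as the one-step-ahead forecast."""
    level = series[0]
    for x in series[1:]:
        level = alpha * x + (1 - alpha) * level
    return level

# Illustrative monthly oil-demand figures (made-up numbers).
demand = [100, 102, 101, 105, 107, 106, 110]
print(round(ses_forecast(demand), 2))
```

A baseline like this gives a floor that any machine-learning or deep-learning forecaster must beat to justify its added complexity.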
Procedia PDF Downloads 35
3652 RNA-seq Analysis of Liver from NASH-HCC Model Mouse Treated with Streptozotocin-High Fat Diet
Authors: Bui Phuong Linh, Yuki Sakakibara, Ryuto Tanaka, Elizabeth H. Pigney, Taishi Hashiguchi
Abstract:
Non-alcoholic steatohepatitis (NASH) is a chronic liver disease, often associated with type II diabetes, which sometimes progresses to more serious conditions such as liver fibrosis and hepatocellular carcinoma (HCC). NASH has become an important health problem worldwide, but therapeutic agents for NASH have not yet been approved, and animal models with high clinical correlation are required. The STAM™ mouse shows the same pathological progression as human NASH patients and has been widely used both for drug efficacy studies and for basic research, such as lipid profiling and gut microbiota research. In this study, we analyzed RNA-seq data from STAM™ mice at each pathological stage (steatosis, steatohepatitis, liver fibrosis, and HCC) and examined the clinical correlation at the genetic level. NASH was induced in male mice by a single subcutaneous injection of 200 µg streptozotocin solution 2 days after birth and feeding with a high-fat diet from 4 weeks of age. The mice were sacrificed and livers collected at 6, 8, 10, 12, 16, and 20 weeks of age. For liver samples, the left lateral lobe was snap-frozen in liquid nitrogen and stored at -80 ˚C for RNA-seq analysis. Total RNA was isolated using the RNeasy mini kit. The expression of genes in canonical pathways of NASH progression from steatosis to hepatocellular carcinoma, such as immune system processes, oxidation-reduction processes, and lipid metabolic processes, was analyzed. Moreover, since genetic traits have been reported to be involved in the development of NASH-HCC, we next analyzed genetic mutations in the STAM™ mice. The number of individuals showing mutations in Mtor, which is involved in insulin signaling, increases as the disease progresses, especially in the liver cancer phase. These results indicate a clinical correlation of gene profiles in the STAM™ mouse.
Keywords: steatosis, non-alcoholic steatohepatitis, fibrosis, hepatocellular carcinoma, RNA-seq
Procedia PDF Downloads 156
3651 A Hybrid Classical-Quantum Algorithm for Boundary Integral Equations of Scattering Theory
Authors: Damir Latypov
Abstract:
A hybrid classical-quantum algorithm to solve boundary integral equations (BIEs) arising in problems of electromagnetic and acoustic scattering is proposed. The quantum speed-up is due to a quantum linear system algorithm (QLSA). The original QLSA of Harrow et al. provides an exponential speed-up over the best-known classical algorithms, but only for sparse systems. Due to the non-local nature of integral operators, however, the matrices arising from the discretization of BIEs are dense. A QLSA for dense matrices was introduced in 2017; its runtime as a function of the system size N is bounded by O(√N polylog(N)), whereas the runtime of the best-known classical algorithm for an arbitrary dense matrix scales as O(N².³⁷³). Instead of the exponential speed-up available for sparse matrices, here we have only a polynomial speed-up; nevertheless, the sufficiently high power of this polynomial, ~4.7, should make the QLSA an appealing alternative. Unfortunately for the QLSA, the asymptotic separability of the Green's function leads to high compressibility of the BIE matrices. Classical fast algorithms such as the multilevel fast multipole method (MLFMM) take advantage of this fact and reduce the runtime to O(N log(N)), i.e., the QLSA is only quadratically faster than the MLFMM. To be truly impactful for computational electromagnetics and acoustics engineers, the QLSA must provide a more substantial advantage than that. We propose a computational scheme which combines elements of the classical fast algorithms with the QLSA to achieve the required performance.
Keywords: quantum linear system algorithm, boundary integral equations, dense matrices, electromagnetic scattering theory
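The asymptotic comparison can be made concrete with a back-of-the-envelope script; the polylog factor is modelled here as log²N and all constants are ignored, so the numbers are purely illustrative of the scaling argument, not actual runtimes:

```python
import math

def classical_dense(n):
    """O(N^2.373): best-known general dense linear solve."""
    return n ** 2.373

def mlfmm(n):
    """O(N log N): classical fast algorithm exploiting compressibility."""
    return n * math.log2(n)

def qlsa_dense(n):
    """O(sqrt(N) polylog N): polylog modelled as log^2 N (an assumption)."""
    return math.sqrt(n) * math.log2(n) ** 2

for n in (10**3, 10**6, 10**9):
    print(n,
          classical_dense(n) / qlsa_dense(n),   # huge polynomial speed-up
          mlfmm(n) / qlsa_dense(n))             # only quadratic speed-up
```

The first ratio grows like N^1.873 (the ~4.7-fold power advantage over the dense solver), while the second grows only like √N, which is the quadratic gap over the MLFMM that the proposed hybrid scheme aims to widen.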
Procedia PDF Downloads 155
3650 Toxicity of Cry1Ac Bacillus thuringiensis against Helicoverpa armigera (Hübner) on Artificial Diet under Laboratory Conditions
Authors: Tahammal Hussain, Khuram Zia, Mumammad Jalal Arif, Megha Parajulee, Abdul Hakeem
Abstract:
Bioassays on neonate, 2nd- and 3rd-instar larvae of Helicoverpa armigera (Hübner) were conducted against the Bacillus thuringiensis protein Cry1Ac. Cry1Ac was serially diluted with distilled water and incorporated into an artificial diet at an appropriate diet temperature. The toxin-incorporated diet was poured into Petri dishes; for controls, distilled water was mixed with the diet. Five toxin doses (0.25, 0.5, 1, 2, and 4 µg/ml) and one control were used for each instar of H. armigera; 20 larvae were used in each replication, and each treatment was replicated four times. The LC50 values of Cry1Ac against neonate, 2nd- and 3rd-instar larvae of H. armigera were 0.34, 0.81, and 1.46 µg/ml, respectively. Thus, under laboratory conditions, Cry1Ac is more effective against neonate larvae of H. armigera than against 2nd- and 3rd-instar larvae.
Keywords: B. thuringiensis, Cry1Ac, H. armigera, toxicity
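LC50 values like those reported are conventionally obtained by probit analysis of dose-mortality data. A simplified log-dose interpolation sketch, with hypothetical mortality percentages (not the study's observed counts), illustrates the underlying idea:

```python
import math

def lc50_interpolate(doses, mortality):
    """Estimate LC50 by linear interpolation of % mortality against
    log10(dose) -- a simplification of full probit analysis."""
    points = list(zip(doses, mortality))
    for (d1, m1), (d2, m2) in zip(points, points[1:]):
        if m1 <= 50 <= m2:
            frac = (50 - m1) / (m2 - m1)
            log_lc50 = math.log10(d1) + frac * (math.log10(d2) - math.log10(d1))
            return 10 ** log_lc50
    raise ValueError("50% mortality not bracketed by the tested doses")

# Hypothetical dose-mortality data for one larval instar.
doses = [0.25, 0.5, 1, 2, 4]        # ug/ml Cry1Ac
mortality = [20, 40, 60, 80, 95]    # percent larvae killed
print(round(lc50_interpolate(doses, mortality), 3))
```

A lower LC50 (as for the neonates' 0.34 µg/ml versus 1.46 µg/ml for 3rd instars) means less toxin is needed to kill half the test larvae, i.e. higher susceptibility.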
Procedia PDF Downloads 413
3649 Correlative Look at Relationship between Emotional Intelligence and Effective Crisis Management in Context of COVID-19 in France and Canada
Authors: Brittany Duboz-Quinville
Abstract:
Emotional intelligence (EI) is a growing field, and many studies are examining how it pertains to the workplace. In the context of crisis management, several studies have postulated that EI could play a role in individuals' ability to execute crisis plans. However, research evaluating the EI of leaders who have actually managed a crisis is still lacking. The COVID-19 pandemic forced many businesses into a crisis situation beginning in March and April of 2020. This study sought to measure both EI and effective crisis management (CM) during the COVID-19 pandemic to determine whether they were positively correlated. A quantitative survey distributed via the internet comprised 15 EI statements and 15 CM statements with Likert-scale responses, and 6 demographic questions with discrete responses. The hypothesis of the study was that EI correlates positively with effective crisis management. The results did not support this hypothesis, as the correlation between EI and CM was not statistically significant. An additional correlation was tested, comparing employees' perception of their superiors' EI (Perception) to employees' opinion of how their superiors managed the crisis (Opinion). This Opinion/Perception correlation was statistically significant. Furthermore, examining this correlation across demographic divisions yielded additional significant results, notably that French-speaking employees have a stronger Opinion/Perception correlation than English-speaking employees. Implications for cultural differences in EI and CM are discussed, as well as possible differences across job sectors. Finally, it is hoped that this study will serve to convince more companies, particularly in France, to embrace EI training for staff and especially managers.
Keywords: crisis management, emotional intelligence, empathy, management training
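Testing a correlation between two sets of survey totals reduces to computing a Pearson coefficient. A self-contained sketch with made-up Likert-scale totals (not the study's data; a real analysis would also report a p-value):

```python
import math

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Made-up paired survey totals for six respondents (EI score, CM score).
ei = [55, 62, 48, 70, 66, 59]
cm = [50, 64, 47, 72, 60, 58]
print(round(pearson_r(ei, cm), 3))
```

Whether a coefficient of a given size is statistically significant then depends on the sample size, which is why a correlation of similar magnitude can be significant in one subgroup and not in another.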
Procedia PDF Downloads 166
3648 Recovery of the Demolition and Construction Waste, Casablanca (Morocco)
Authors: Morsli Mourad, Tahiri Mohamed, Samdi Azzeddine
Abstract:
Casablanca is the biggest city in Morocco, concentrating more than 60% of the economic and industrial activity of the kingdom. Its building and public works (BTP) sector is the leading source of inert waste scattered in open areas. This inert waste is a major challenge for the city of Casablanca: improperly managed, it causes significant nuisance for the environment and the health of the population. Hence, the vision of our project is to recycle and valorize concrete waste. In this work, we present concrete results from the exploitation of this abundant and permanent deposit. Typical wastes are concrete, clay and concrete bricks, ceramic tiles, marble panels, gypsum, scrap metal, and wood. The work performed included geolocation using a combination of artificial intelligence and Google Earth, estimation of the amount of waste per site, sorting, crushing, grinding, and physicochemical characterization of the samples. We then developed several types of substrates: lightweight cement, coatings, and ceramic adhesive. These products were tested and characterized by X-ray fluorescence, specific surface area, resistance to bending and crushing, etc. We present in detail the main results of our research work and describe the specific properties of each material developed.
Keywords: demolition and construction site waste, GIS combination software, inert waste valorization, coatings, lightweight cement, Casablanca
Procedia PDF Downloads 112