Search results for: Rule Based Architecture
26377 Enhanced Arabic Semantic Information Retrieval System Based on Arabic Text Classification
Authors: A. Elsehemy, M. Abdeen, T. Nazmy
Abstract:
Since the appearance of the Semantic Web, many semantic search techniques and models have been proposed to exploit the information in ontologies and enhance traditional keyword-based search. Much progress has been made for languages such as English, German, French, and Spanish; however, other languages, such as Arabic, are not yet fully supported. In this paper, we present a framework for ontology-based information retrieval for the Arabic language. Our system consists of four main modules: a query parser, an indexer, a search module, and a ranking module. Our approach includes building a semantic index by linking ontology concepts to documents, with an annotation weight for each link to be used in ranking the results. We also augmented the framework with an automatic document categorizer, which enhances the overall document ranking. We built three Arabic domain ontologies (Sports, Economics, and Politics) as examples for the Arabic language, along with a knowledge base that consists of 79 classes and more than 1,456 instances. The system is evaluated using the precision and recall metrics. We ran many retrieval operations on a sample of 40,316 documents comprising 320 MB of pure text. The results show that semantic search enhanced with text classification performs better than the system without classification.
Keywords: Arabic text classification, ontology based retrieval, Arabic semantic web, information retrieval, Arabic ontology
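The indexing and evaluation scheme the abstract describes can be sketched in a few lines. The sketch below is an illustration under assumed names and weights (the concept labels, weight values, and function names are not from the paper): an inverted index maps ontology concepts to documents with an annotation weight per link, ranking sums the weights over matched concepts, and precision/recall are the reported metrics.

```python
from collections import defaultdict

# Hypothetical semantic index: ontology concept -> {doc_id: annotation_weight}.
semantic_index = defaultdict(dict)

def annotate(concept, doc_id, weight):
    """Link an ontology concept to a document with an annotation weight."""
    semantic_index[concept][doc_id] = weight

def rank(query_concepts):
    """Score documents by summing annotation weights over matched concepts."""
    scores = defaultdict(float)
    for concept in query_concepts:
        for doc_id, w in semantic_index.get(concept, {}).items():
            scores[doc_id] += w
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)

def precision_recall(retrieved, relevant):
    """The evaluation metrics reported in the abstract."""
    retrieved, relevant = set(retrieved), set(relevant)
    hits = len(retrieved & relevant)
    precision = hits / len(retrieved) if retrieved else 0.0
    recall = hits / len(relevant) if relevant else 0.0
    return precision, recall

annotate("economy/inflation", "doc42", 0.8)
annotate("sports/football", "doc17", 0.6)
print(rank(["economy/inflation"]))                      # [('doc42', 0.8)]
print(precision_recall(["doc42"], ["doc42", "doc99"]))  # (1.0, 0.5)
```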
Procedia PDF Downloads 527
26376 Developing a DNN Model for the Production of Biogas From a Hybrid BO-TPE System in an Anaerobic Wastewater Treatment Plant
Authors: Hadjer Sadoune, Liza Lamini, Scherazade Krim, Amel Djouadi, Rachida Rihani
Abstract:
Deep neural networks are highly regarded for their accuracy in predicting intricate fermentation processes. Their ability to learn from large amounts of data makes them particularly effective models. The primary obstacle in improving their performance is carefully choosing suitable hyperparameters, including the neural network architecture (number of hidden layers and hidden units), activation function, optimizer, learning rate, and other relevant factors. This study predicts biogas production from real wastewater treatment plant data using a sophisticated approach: hybrid Bayesian optimization with a tree-structured Parzen estimator (BO-TPE) for an optimised deep neural network (DNN) model. The plant utilizes an Upflow Anaerobic Sludge Blanket (UASB) digester that treats industrial wastewater from soft drinks and breweries. The digester has a working volume of 1574 m³ and a total volume of 1914 m³; its internal diameter and height are 19 m and 7.14 m, respectively. The data preprocessing was conducted with meticulous attention to preserving data quality while avoiding data reduction. Three normalization techniques (MinMaxScaler, RobustScaler, and StandardScaler) were applied to the preprocessed data and compared with the non-normalized data. The RobustScaler approach showed strong predictive ability for estimating the volume of biogas produced. The highest predicted biogas volume was 2236.105 Nm³/d, with coefficient of determination (R²), mean absolute error (MAE), and root mean square error (RMSE) values of 0.712, 164.610, and 223.429, respectively.
Keywords: anaerobic digestion, biogas production, deep neural network, hybrid BO-TPE, hyperparameters tuning
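As a concrete illustration of BO-TPE hyperparameter tuning, the sketch below uses Optuna, whose TPESampler implements the tree-structured Parzen estimator, around a small scikit-learn DNN with the RobustScaler the study found best. The data, search ranges, and model size are assumptions for illustration, not the plant data or the paper's configuration.

```python
import numpy as np
import optuna
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import RobustScaler
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error

# Synthetic stand-in for the plant data: 5 process variables -> biogas volume.
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 5))
y = X @ rng.normal(size=5) + rng.normal(scale=0.1, size=500)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

scaler = RobustScaler().fit(X_tr)   # the best-performing scaler in the study
X_tr_s, X_te_s = scaler.transform(X_tr), scaler.transform(X_te)

def objective(trial):
    # Hyperparameters searched by TPE (validation split kept simple here).
    n_layers = trial.suggest_int("n_layers", 1, 3)
    units = trial.suggest_int("units", 8, 128, log=True)
    lr = trial.suggest_float("lr", 1e-4, 1e-1, log=True)
    model = MLPRegressor(hidden_layer_sizes=(units,) * n_layers,
                         learning_rate_init=lr, max_iter=500, random_state=0)
    model.fit(X_tr_s, y_tr)
    return mean_squared_error(y_te, model.predict(X_te_s)) ** 0.5  # RMSE

# Optuna's default sampler is the tree-structured Parzen estimator (TPE).
study = optuna.create_study(direction="minimize",
                            sampler=optuna.samplers.TPESampler(seed=0))
study.optimize(objective, n_trials=30)
print(study.best_params, study.best_value)
```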
Procedia PDF Downloads 40
26375 Examining of Tool Wear in Cryogenic Machining of Cobalt-Based Haynes 25 Superalloy
Authors: Murat Sarıkaya, Abdulkadir Güllü
Abstract:
Haynes 25 (also known as L-605) is a cobalt-based superalloy widely used in applications such as the aerospace industry, turbine and furnace parts, power generators, heat exchangers, and petroleum refining components due to its excellent characteristics. However, this alloy is more difficult to machine than ordinary steels or even stainless steels. In the present work, an experimental investigation was performed under cryogenic cooling to determine cutting tool wear patterns and obtain optimal cutting parameters in turning the cobalt-based superalloy Haynes 25. In the experiments, an uncoated carbide tool was used, and cutting speed (V) and feed rate (f) were considered as test parameters. Tool wear (VBmax) was measured as the process performance indicator. Analysis of variance (ANOVA) was performed to determine the importance of the machining parameters.
Keywords: cryogenic machining, difficult-to-cut alloy, tool wear, turning
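A two-factor ANOVA of the kind the abstract describes can be run as follows; the wear measurements and factor levels below are made up for illustration, not the paper's data.

```python
import pandas as pd
from statsmodels.formula.api import ols
from statsmodels.stats.anova import anova_lm

# Illustrative measurements: flank wear VBmax (mm) recorded at combinations
# of cutting speed V (m/min) and feed rate f (mm/rev).
data = pd.DataFrame({
    "V":  [60, 60, 80, 80, 100, 100, 60, 80, 100],
    "f":  [0.1, 0.2, 0.1, 0.2, 0.1, 0.2, 0.15, 0.15, 0.15],
    "VB": [0.12, 0.18, 0.16, 0.24, 0.21, 0.31, 0.15, 0.20, 0.26],
})

# Two-factor ANOVA: how much of the variance in VBmax is explained
# by cutting speed and how much by feed rate.
model = ols("VB ~ C(V) + C(f)", data=data).fit()
print(anova_lm(model))
```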
Procedia PDF Downloads 593
26374 Linguistic Features for Sentence Difficulty Prediction in Aspect-Based Sentiment Analysis
Authors: Adrian-Gabriel Chifu, Sebastien Fournier
Abstract:
One of the challenges of natural language understanding is to deal with the subjectivity of sentences, which may express opinions and emotions that add layers of complexity and nuance. Sentiment analysis is a field that aims to extract and analyze these subjective elements from text, and it can be applied at different levels of granularity, such as document, paragraph, sentence, or aspect. Aspect-based sentiment analysis is a well-studied topic with many available data sets and models. However, there is no clear definition of what makes a sentence difficult for aspect-based sentiment analysis. In this paper, we explore this question by conducting an experiment with three data sets, "Laptops", "Restaurants", and "MTSC" (Multi-Target-dependent Sentiment Classification), as well as a merged version of the three. We study the impact of domain diversity and syntactic diversity on difficulty. We use a combination of classifiers to identify the most difficult sentences and analyze their characteristics. We employ two ways of defining sentence difficulty. The first is binary and labels a sentence as difficult if the classifiers fail to correctly predict its sentiment polarity. The second is a six-level scale based on how many of the top five best-performing classifiers can correctly predict the sentiment polarity. We also define nine linguistic features that, combined, aim at estimating difficulty at the sentence level.
Keywords: sentiment analysis, difficulty, classification, machine learning
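One reading of the six-level scale can be made concrete as follows: a sentence's difficulty is the number of top-five classifiers that fail on it. The sketch assumes the classifier predictions are already collected; labels and values are illustrative.

```python
import numpy as np

def difficulty_levels(predictions, y_true):
    """Six-level difficulty scale (0-5): the number of the top-five
    classifiers that FAIL to predict a sentence's polarity.

    predictions: array of shape (5, n_sentences), one row per classifier.
    """
    predictions = np.asarray(predictions)
    correct = (predictions == np.asarray(y_true)).sum(axis=0)  # 0..5 each
    return 5 - correct  # 0 = easy (all correct) ... 5 = hardest (all wrong)

# Toy example: 5 classifiers, 4 sentences, polarity labels in {-1, 0, 1}.
preds = [[1, 0, -1, 1],
         [1, 0, -1, -1],
         [1, 1, -1, -1],
         [1, 0, 1, -1],
         [1, 0, -1, -1]]
print(difficulty_levels(preds, [1, 0, -1, 1]))  # [0 1 1 4]
```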
Procedia PDF Downloads 93
26373 Utilization of an Object Oriented Tool to Perform Model-Based Safety Analysis According to Extended Failure System Models
Authors: Royia Soliman, Salma ElAnsary, Akram Amin Abdellatif, Florian Holzapfel
Abstract:
Model-Based Safety Analysis (MBSA) is an approach in which the system and safety engineers share a common system model created using a model-based development process. The model can also be extended with the failure modes of the system components. There are two well-known approaches for adding fault behaviors to system models. The first is to embed the failure behavior directly into the system design. The second is to develop a fault model separately from the system model and then combine the two independent models for safety analysis. This paper introduces a hybrid MBSA approach that uses informal abstracted models to investigate failure behaviors, combining concepts such as directed graph traversal, event lists, and Constraint Satisfaction Problems (CSP). The approach is implemented in an object-oriented programming language. Each component is abstracted to its failure logic and its relationships with connected components. The implemented approach is tested on various flight control systems, including electrical and multi-domain examples. The various tests are analyzed, and a comparison with different approaches is presented.
Keywords: flight control systems, model based safety analysis, safety assessment analysis, system modelling
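The directed-graph-traversal part of the approach can be illustrated with a breadth-first failure propagation over an abstracted component graph. The component names and topology below are hypothetical, not the tested flight control systems.

```python
from collections import deque

# Hypothetical component graph: edges point from a component to the
# components its output feeds.
feeds = {
    "battery":       ["bus"],
    "bus":           ["actuator_ctrl", "sensor"],
    "sensor":        ["actuator_ctrl"],
    "actuator_ctrl": ["actuator"],
    "actuator":      [],
}

def propagate(failed):
    """BFS traversal: every component reachable from a failed one is affected."""
    affected, queue = set(failed), deque(failed)
    while queue:
        comp = queue.popleft()
        for downstream in feeds.get(comp, []):
            if downstream not in affected:
                affected.add(downstream)
                queue.append(downstream)
    return affected

print(propagate({"battery"}))
# {'battery', 'bus', 'sensor', 'actuator_ctrl', 'actuator'}
```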
Procedia PDF Downloads 166
26372 Polymersomes in Drug Delivery: A Comparative Review with Liposomes and Micelles
Authors: Salma E. Ahmed
Abstract:
Since the mid-1950s, enormous attention has been paid to nanocarriers and their applications in drug and gene delivery. Among these vesicles, liposomes and micelles have been heavily investigated due to their many advantages over other types. Liposomes, for instance, are mostly distinguished by their ability to encapsulate hydrophobic, hydrophilic, and amphiphilic drugs. Micelles, on the other hand, are self-assembled shells of lipids, amphiphilic or oppositely charged block copolymers that, once exposed to aqueous media, can entrap hydrophobic agents and possess prolonged circulation in the bloodstream. Both carriers are considered compatible and biodegradable. Nevertheless, they have limited stability, chemical versatility, and drug encapsulation efficiency. To overcome these downsides, strategies have evolved for optimizing a novel drug delivery system with the architecture of liposomes and the polymeric characteristics of micelles. Polymersomes are vehicles with fluidic cores and hydrophobic shells that are protected and isolated from the aqueous media by hydrated hydrophilic brushes, which give the carrier its distinctive polymeric bilayer shape. Similar to liposomes, this merit enables the carrier to encapsulate a wide range of agents, regardless of their affinities and solubilities in water. In addition, the high molecular weight of the amphiphiles that build the body of the polymersomes increases their colloidal and chemical stability and reduces the permeability of the polymeric membranes, making the vesicles more protective of the encapsulated drug. These carriers can also be made responsive to targeting or triggering by manipulating their composition and attaching moieties and conjugates to their bodies. These appealing characteristics, in addition to the ease of synthesis, give polymersomes great potential in the area of drug delivery. Thus, their design and characterization, in comparison with liposomes and micelles, are briefly reviewed in this work.
Keywords: controlled release, liposomes, micelles, polymersomes, targeting
Procedia PDF Downloads 197
26371 Marginalisation of an Age Old Culture. The Case of Female Cultural Initiation in Some South African Cultural Groups
Authors: Lesibana Rafapa
Abstract:
Accounts exist of circumcision-anchored cultural initiation in central, East, Southern, North, and West Africa, straddling states such as Botswana, Kenya, Lesotho, Malawi, Senegal, South Africa, Zambia, and Zimbabwe. This attests to the continent-wide spread of this cultural practice. In this paper, the writer relates the cultural aspect of circumcision-subsuming initiation among black African cultural groups across the continent to the notion that African cultures are varied yet subscribe to a common central concept. The premise of the paper is that the common practice of initiation, in which both male and female children are initiated by adults into the tradition and customs of a people, coincides with such a central concept. The practice of traditional initiation is broad enough to encompass aspects of spirituality, morality, and social organisation, in the nature of the central concept of which it is a trans-sectional part. Cultural initiation, sometimes referred to as traditional circumcision, constitutes culture-determined rites of passage for the initiates. The aim of the study whose findings are presented in this paper was to probe gender equality in the development and promotion of the cultural practice of initiation. The researcher intended to demonstrate how, in South Africa, female circumcision is treated equally or marginalised in the efforts of the democratic government to regulate and strengthen the practice of circumcision as part of its broader liberation programme meant to reverse the politico-cultural bondage experienced during apartheid rule, which the present black regime helped bring to an end. It is argued that the failure to regard female circumcision as equal to its male counterpart is a travesty of the black government's legislation and policies espousing equality and the protection and empowerment of vulnerable and previously marginalised population groups, which include black women. The writer conducted a desktop study of the history and characteristics of female circumcision among the black Northern Sotho, VaTsonga, and VhaVenda cultural groups of the Limpopo Province, stretching north to South Africa's border with Zimbabwe, as well as of the literature on how political and other authorities exert efforts to preserve and empower the practice. The findings were that male initiation is foregrounded and totalised to represent the practice of initiation as a whole, at the expense of its female counterpart, which faces marginalisation and unequal regard. This paper outlines how such impoverishment of an otherwise woman-empowering cultural practice deprives black cultures that suffered brutal repression during apartheid of a fuller recovery much needed in the democratic era. The writer applies aspects of postcolonial theory and tropes of feminism in discussing the uneven status of cultural circumcision at the hands of the present-day authorities.
Keywords: African cultures, female circumcision, gender equality, women empowerment
Procedia PDF Downloads 187
26370 Design of Knowledge Management System with Geographic Information System
Authors: Angga Hidayah Ramadhan, Luciana Andrawina, M. Azani Hasibuan
Abstract:
Data becomes the core of a decision only if it is properly treated or processed: data is processed into information, and information into knowledge, from which a wise decision can be made. Today, many organizations have not yet realized this, including the XYZ University Admission Directorate, the executor of a national admission scheme called Seleksi Masuk Bersama (SMB), where workers have so far relied only on intuition to make decisions. If the Directorate analyzed its data properly, it could make better decisions and maximize PIN sales to the student candidates and registrants who follow SMB. Therefore, a Knowledge Management System (KMS) with a Geographic Information System (GIS) using the 5C4C model is needed to turn the organization's data into something more useful and to support decision-making. This information system processes the PIN sales data into information with 5C (contextualized, categorized, calculated, corrected, condensed) and converts information into knowledge with 4C (comparing, consequences, connections, conversations). Through these steps, the data becomes useful for making decisions, resolving problems, communicating, and helping inexperienced employees learn more quickly. The system also supports viewing and visualization based on spatial data, with GIS functionality that can indicate events in each province using suitable indicators. In addition, the system can capture tacit knowledge and convert it into explicit knowledge in an expert system, based on the problems identified from the consequences of the information. With this system, every team can make decisions in the same structured way and, most importantly, based on actual events and data.
Keywords: 5C4C, data, information, knowledge
Procedia PDF Downloads 465
26369 Native Language Identification with Cross-Corpus Evaluation Using Social Media Data: 'Reddit'
Authors: Yasmeen Bassas, Sandra Kuebler, Allen Riddell
Abstract:
Native language identification (NLI) is one of the growing subfields in natural language processing (NLP). The task is mainly concerned with predicting the native language of an author writing in a second language. In this paper, we investigate the performance of two types of features, content-based features and content-independent features, when they are evaluated on a different corpus (social media data from Reddit). In this NLI task, the predefined models are trained on one corpus (TOEFL), and the trained models are then evaluated on different data from an external corpus (Reddit). Three classifiers are used: a baseline, a linear SVM, and logistic regression. Results show that content-based features are more accurate and robust than content-independent ones, both within the corpus and across corpora.
Keywords: NLI, NLP, content-based features, content independent features, social media corpus, ML
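The train-on-TOEFL, test-on-Reddit protocol can be sketched with scikit-learn as below. The toy texts and labels are invented, and the actual feature sets (content-based vs. content-independent) and baseline are richer than this illustration.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.svm import LinearSVC
from sklearn.metrics import accuracy_score

# Toy stand-ins for the corpora: (text, native_language) pairs.
toefl = [("I am agree with this opinion", "es"),
         ("He has many informations", "de"),
         ("She enjoy to play football", "es"),
         ("We discussed about the topic", "de")]
reddit = [("I am agree that it is good", "es"),
          ("There are many informations here", "de")]

X_train, y_train = zip(*toefl)
X_test, y_test = zip(*reddit)

# Content-independent features would restrict this to function words or
# character n-grams; the word n-grams below mix in content-based signal.
vec = TfidfVectorizer(ngram_range=(1, 2))
clf = LinearSVC()
clf.fit(vec.fit_transform(X_train), y_train)

# Cross-corpus evaluation: train on TOEFL, test on Reddit.
print(accuracy_score(y_test, clf.predict(vec.transform(X_test))))
```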
Procedia PDF Downloads 139
26368 Remote Vital Signs Monitoring in Neonatal Intensive Care Unit Using a Digital Camera
Authors: Fatema-Tuz-Zohra Khanam, Ali Al-Naji, Asanka G. Perera, Kim Gibson, Javaan Chahl
Abstract:
Conventional contact-based vital signs monitoring sensors, such as pulse oximeters or electrocardiogram (ECG) electrodes, may cause discomfort, skin damage, and infections, particularly in neonates with fragile, sensitive skin. Remote monitoring of vital signs is therefore desired in both clinical and non-clinical settings to overcome these issues. Camera-based vital signs monitoring is a recent technology for these applications with many positive attributes. However, camera-based studies on neonates in clinical settings are still limited. In this study, the heart rate (HR) and respiratory rate (RR) of eight infants at the Neonatal Intensive Care Unit (NICU) in Flinders Medical Centre were remotely monitored using a digital camera applying color- and motion-based computational methods. The region of interest (ROI) was efficiently selected by incorporating an image decomposition method. Furthermore, spatial averaging, spectral analysis, band-pass filtering, and peak detection were used to extract both HR and RR. The experimental results were validated against ground truth data obtained from an ECG monitor and showed a strong correlation, with Pearson correlation coefficients (PCC) of 0.9794 and 0.9412 for HR and RR, respectively. The RMSE between camera-based data and ECG data was 2.84 beats/min for HR and 2.91 breaths/min for RR. A Bland-Altman analysis also showed close agreement between the two data sets, with mean biases of 0.60 beats/min and 1 breath/min, and limits of agreement of -4.9 to +6.1 beats/min and -4.4 to +6.4 breaths/min for HR and RR, respectively. Therefore, video camera imaging may replace conventional contact-based monitoring in the NICU and has potential applications in other contexts, such as home health monitoring.
Keywords: neonates, NICU, digital camera, heart rate, respiratory rate, image decomposition
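The signal chain the abstract lists (band-pass filtering, peak detection, then Pearson/Bland-Altman validation) can be sketched as follows on a synthetic trace; the frame rate, pass band, and all numbers are assumptions, not the NICU recordings.

```python
import numpy as np
from scipy.signal import butter, filtfilt, find_peaks

fs = 30.0                      # assumed camera frame rate, Hz
t = np.arange(0, 30, 1 / fs)
# Synthetic spatially-averaged ROI trace: 2.2 Hz pulse (132 bpm) + noise.
signal = np.sin(2 * np.pi * 2.2 * t) + 0.3 * np.random.randn(t.size)

# Band-pass filter to an assumed neonatal cardiac band (~1.5-4 Hz).
b, a = butter(3, [1.5 / (fs / 2), 4.0 / (fs / 2)], btype="band")
filtered = filtfilt(b, a, signal)

# Peak detection -> heart rate in beats per minute.
peaks, _ = find_peaks(filtered, distance=fs / 4)
hr = 60 * (len(peaks) - 1) / ((peaks[-1] - peaks[0]) / fs)
print(f"estimated HR: {hr:.1f} bpm")

def bland_altman(camera, ecg):
    """Mean bias and 95% limits of agreement between two measurements."""
    diff = np.asarray(camera, float) - np.asarray(ecg, float)
    bias, sd = diff.mean(), diff.std(ddof=1)
    return bias, bias - 1.96 * sd, bias + 1.96 * sd

print(bland_altman([120, 131, 142], [119, 133, 140]))
```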
Procedia PDF Downloads 107
26367 Developing Artificial Neural Networks (ANN) for Falls Detection
Authors: Nantakrit Yodpijit, Teppakorn Sittiwanchai
Abstract:
The number of older adults is rising rapidly, and the world's population is aging. Falls are among the most common and serious health problems in the elderly; they may lead to acute and chronic injuries and to death. Fall-prone individuals are at greater risk of decreased quality of life, lowered productivity and poverty, social problems, and additional health problems. A number of studies on falls prevention using fall detection systems have been conducted. Many available technologies for fall detection are laboratory-based and can incur substantial costs; utilizing alternative technologies can potentially reduce these costs. This paper presents the design and development of a new wearable fall detection system using an accelerometer and a gyroscope as motion sensors for detecting body orientation and movement. Algorithms are developed to differentiate between Activities of Daily Living (ADL) and falls by comparing threshold-based values with Artificial Neural Networks (ANN). Results indicate the possibility of using the new threshold-based method with a neural network algorithm to reduce the number of false positives (false alarms) and improve the accuracy of the fall detection system.
Keywords: aging, algorithm, artificial neural networks (ANN), fall detection system, motion sensors, threshold
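A minimal two-stage sketch of the threshold-plus-ANN idea, assuming an impact threshold on the acceleration magnitude and invented window features; the threshold value, feature choice, and training data are illustrative, not the paper's.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier

G = 9.81
THRESHOLD = 2.5 * G   # assumed impact threshold on acceleration magnitude

def candidate_fall(ax, ay, az):
    """Stage 1: cheap threshold test on the acceleration magnitude."""
    return np.sqrt(ax**2 + ay**2 + az**2).max() > THRESHOLD

# Stage 2: an ANN separates true falls from threshold-crossing ADLs
# (e.g., sitting down hard). Features are illustrative window statistics:
# [peak magnitude, post-impact tilt change (deg), signal variance].
X = np.array([[3.1 * G, 80, 12.0],   # fall
              [2.8 * G, 75, 10.5],   # fall
              [2.6 * G,  5,  3.0],   # vigorous ADL
              [2.7 * G,  8,  2.5]])  # vigorous ADL
y = np.array([1, 1, 0, 0])

ann = MLPClassifier(hidden_layer_sizes=(8,), max_iter=2000, random_state=0)
ann.fit(X, y)
print(ann.predict([[3.0 * G, 70, 11.0]]))  # -> [1], classified as a fall
```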
Procedia PDF Downloads 498
26366 A Reinforcement Learning Based Method for Heating, Ventilation, and Air Conditioning Demand Response Optimization Considering Few-Shot Personalized Thermal Comfort
Authors: Xiaohua Zou, Yongxin Su
Abstract:
The proper operation of heating, ventilation, and air conditioning (HVAC) is of great significance in improving the security, stability, and economy of power system operation. However, the uncertainty of the operating environment, thermal comfort that varies across users, and the need for rapid decision-making pose challenges for HVAC demand response (DR) optimization. In this regard, this paper proposes a reinforcement learning-based method for HVAC demand response optimization considering few-shot personalized thermal comfort (PTC). First, an HVAC DR optimization framework based on a few-shot PTC model and deep reinforcement learning (DRL) is designed, in which the output of the few-shot PTC model serves as the input of the DRL agent. Then, a few-shot PTC model that distinguishes between awake and asleep states is established, which has excellent engineering usability. Next, based on soft actor-critic, an HVAC DR optimization algorithm considering the user's PTC is designed to deal with uncertainty and make decisions rapidly. Experimental results show that the proposed method can efficiently obtain the user's PTC temperature, reduce energy cost while ensuring the user's PTC, and achieve rapid decision-making under uncertainty.
Keywords: HVAC, few-shot personalized thermal comfort, deep reinforcement learning, demand response
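The trade-off the RL agent optimizes can be illustrated by a per-step reward that charges energy cost and penalizes leaving the personalized comfort band; the band would come from the few-shot PTC model (awake vs. asleep profiles), and the penalty weight and values below are assumptions.

```python
def hvac_reward(energy_kwh, price, room_temp, ptc_low, ptc_high, penalty=10.0):
    """Per-step reward for the DRL agent: minimize energy cost while
    keeping the room inside the user's personalized comfort band.
    (ptc_low, ptc_high) would come from the few-shot PTC model; the
    penalty weight here is an assumed tuning constant."""
    cost = energy_kwh * price
    discomfort = max(0.0, ptc_low - room_temp, room_temp - ptc_high)
    return -(cost + penalty * discomfort)

# Awake vs. asleep comfort bands predicted by a hypothetical PTC model.
print(hvac_reward(1.2, 0.15, 25.5, ptc_low=23.0, ptc_high=26.0))  # in band
print(hvac_reward(0.4, 0.15, 27.2, ptc_low=22.0, ptc_high=25.0))  # too warm
```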
Procedia PDF Downloads 90
26365 Insights into Insect Vectors: Liberibacter Interactions
Authors: Murad Ghanim
Abstract:
The citrus greening disease, also known as Huanglongbing, caused by the phloem-limited bacterium Candidatus Liberibacter asiaticus (CLas), has resulted in tremendous losses and the death of millions of citrus trees worldwide. CLas is transmitted by the Asian citrus psyllid (ACP) Diaphorina citri. The closely related bacterium Candidatus Liberibacter solanacearum (CLso), which is associated with vegetative disorders in carrots and the zebra chip disease in potatoes, is transmitted by other psyllid species, including Bactericera trigonica in carrots and B. cockerelli in potatoes. Chemical sprays are currently the prevailing method for managing these diseases by limiting psyllid populations; however, they are of limited effectiveness. A promising approach to preventing the transmission of these pathogens is to interfere with the vector-pathogen interactions, but our understanding of these processes is very limited. CLas induces changes in the nuclear architecture of the ACP midgut and activates programmed cell death (apoptosis) in this organ. Strikingly, CLso displays an opposite effect in the gut of B. trigonica, showing limited apoptosis but widespread necrosis. Electron and fluorescence microscopy further showed that CLas induces the formation of endoplasmic reticulum (ER) inclusion- and replication-like bodies, in which it increases and multiplies. ER involvement in bacterial replication is hypothesized to be the first stage of an immune response leading to the apoptotic and necrotic responses. ER exploitation and the subsequent events that lead to these cellular and stress responses might activate a cascade of molecular responses ending in apoptosis and necrosis. Understanding the molecular interactions that underlie the necrotic and apoptotic responses to the bacteria will increase our knowledge of ACP-CLas and B. trigonica-CLso interactions and will set the foundation for developing novel, efficient strategies to disturb these interactions and inhibit transmission.
Keywords: Liberibacter, psyllid, transmission, apoptosis, necrosis
Procedia PDF Downloads 146
26364 A Quantitative Analysis for the Correlation between Corporate Financial and Social Performance
Authors: Wafaa Salah, Mostafa A. Salama, Jane Doe
Abstract:
Recently, corporate social performance (CSP) has become no less important than corporate financial performance (CFP). Debate still exists about the nature of the relationship between CSP and CFP: whether it is a positive, negative, or neutral correlation. The objective of this study is to explore the relationship between corporate social responsibility (CSR) reports and CFP. The study uses accounting-based and market-based quantitative measures to quantify the financial performance of seven organizations listed on the Egyptian Stock Exchange from 2007 to 2014. It then uses information retrieval techniques to quantify the contribution of each of the three dimensions of the corporate social responsibility report (environmental, social, and economic). Finally, the correlations between these two sets of variables are examined together in a model. This model is applied to seven firms that publish social responsibility reports. The results show a positive correlation between earnings per share (a market-based measure) and the economic dimension of the CSR report. On the other hand, total assets and property, plant, and equipment (accounting-based measures) are positively correlated with the environmental and social dimensions of the CSR reports, while there is no significant relationship between ROA, ROE, or operating income and corporate social responsibility. This study contributes to the literature by providing more clarification of the relationship between CFP and isolated CSR activities in a developing country.
Keywords: financial, social, machine learning, corporate social performance, corporate social responsibility
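The paired correlation analysis can be reproduced in outline with pandas; the firm-level numbers below are invented placeholders, not the Egyptian Stock Exchange data.

```python
import pandas as pd

# Illustrative firm-year panel: accounting/market measures alongside the
# quantified weight of each CSR-report dimension (values are made up).
df = pd.DataFrame({
    "eps":             [1.2, 0.8, 1.5, 1.1, 0.9],
    "total_assets":    [320, 180, 410, 300, 220],
    "csr_economic":    [0.30, 0.18, 0.35, 0.28, 0.20],
    "csr_environment": [0.22, 0.10, 0.30, 0.21, 0.15],
    "csr_social":      [0.25, 0.12, 0.33, 0.24, 0.14],
})

# Pearson correlations between CFP measures and CSR dimensions,
# mirroring the paired analysis in the study.
print(df.corr(method="pearson").loc[
    ["eps", "total_assets"],
    ["csr_economic", "csr_environment", "csr_social"]])
```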
Procedia PDF Downloads 313
26363 Domain-Specific Deep Neural Network Model for Classification of Abnormalities on Chest Radiographs
Authors: Nkechinyere Joy Olawuyi, Babajide Samuel Afolabi, Bola Ibitoye
Abstract:
This study collected a preprocessed dataset of chest radiographs, formulated a deep neural network model for detecting abnormalities, evaluated the performance of the formulated model, and implemented a prototype, with the view of developing a deep neural network model to automatically classify abnormalities in chest radiographs. To achieve the overall purpose of this research, a large set of chest X-ray images was sourced and collected from the CheXpert dataset, an online repository of annotated chest radiographs compiled by the Machine Learning Research Group at Stanford University. The chest radiographs were preprocessed into a format that can be fed into a deep neural network; the preprocessing techniques used were standardization and normalization. The classification problem was formulated as a multi-label binary classification model that uses a convolutional neural network architecture to decide whether an abnormality is present in a chest radiograph. The classification model was evaluated using specificity, sensitivity, and the Area Under Curve (AUC) score as parameters. A prototype of the classification model was implemented using the Keras open-source deep learning framework in the Python programming language. Based on the AUC-ROC curve, the model was able to classify atelectasis, support devices, pleural effusion, pneumonia, a normal CXR (no finding), pneumothorax, and consolidation. However, lung opacity and cardiomegaly had probabilities of less than 0.5 and thus were classified as absent. Precision, recall, and F1 score values were all 0.78; this implies that the numbers of false positives and false negatives are the same, revealing some measure of label imbalance in the dataset. The study concluded that the developed model is sufficient to classify abnormalities present in chest radiographs as present or absent.
Keywords: transfer learning, convolutional neural network, radiograph, classification, multi-label
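A minimal Keras sketch of the multi-label binary formulation: one sigmoid output per abnormality, binary cross-entropy loss, and AUC as the metric, with the 0.5 threshold deciding present vs. absent. The layer sizes and label count are assumptions, not the study's architecture.

```python
import tensorflow as tf

NUM_LABELS = 9  # e.g., atelectasis, effusion, pneumonia, ... (assumed count)

# Multi-label CNN: independent sigmoid outputs let several abnormalities
# be present in the same radiograph.
model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(224, 224, 1)),
    tf.keras.layers.Conv2D(32, 3, activation="relu"),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Conv2D(64, 3, activation="relu"),
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dense(NUM_LABELS, activation="sigmoid"),
])
model.compile(optimizer="adam",
              loss="binary_crossentropy",   # one binary problem per label
              metrics=[tf.keras.metrics.AUC(name="auc")])
model.summary()
# Labels with predicted probability < 0.5 are reported as "absent",
# matching the decision rule described in the abstract.
```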
Procedia PDF Downloads 132
26362 Electrophysiological Correlates of Statistical Learning in Children with and without Developmental Language Disorder
Authors: Ana Paula Soares, Alexandrina Lages, Helena Oliveira, Francisco-Javier Gutiérrez-Domínguez, Marisa Lousada
Abstract:
From an early age, exposure to a spoken language allows us to implicitly capture the structure underlying the succession of speech sounds in that language and to segment it into meaningful units (words). Statistical learning (SL), i.e., the ability to pick up patterns in the sensory environment even without the intention or consciousness of doing so, is thus assumed to play a central role in the acquisition of the rule-governed aspects of language and possibly to lie behind the language difficulties exhibited by children with developmental language disorder (DLD). The research conducted so far has, however, led to inconsistent results, which might stem from the behavioral tasks used to test SL. In a classic SL experiment, participants are first exposed to a continuous stream (e.g., of syllables) in which, unbeknownst to the participants, stimuli are grouped into triplets that always appear together in the stream (e.g., 'tokibu', 'tipolu'), with no pauses between them (e.g., 'tokibutipolugopilatokibu') and without any information regarding the task or the stimuli. Following exposure, SL is assessed by asking participants to discriminate triplets previously presented ('tokibu') from new sequences never presented together during exposure ('kipopi'), i.e., to perform a two-alternative forced-choice (2-AFC) task. Despite its widespread use in testing SL, the 2-AFC has come under increasing criticism, as it is an offline post-learning task that only assesses the result of the learning that occurred during the previous exposure phase and that might be affected by factors beyond the computation of the regularities embedded in the input, typically the likelihood of two syllables occurring together, a statistic known as transitional probability (TP). One solution to overcome these limitations is to assess SL as exposure to the stream unfolds, using online techniques such as event-related potentials (ERP), which are highly sensitive to the time course of learning in the brain. Here we collected ERPs to examine the neurofunctional correlates of SL in preschool children with DLD and chronological-age-matched controls with typical language development (TLD), who were exposed to an auditory stream containing eight three-syllable nonsense words, four presenting high TPs and four low TPs, to further analyze whether the ability of DLD and TLD children to extract word-like units from the stream was modulated by the words' predictability. Moreover, to ascertain whether previous knowledge of the to-be-learned regularities affected the neural responses to high- and low-TP words, children performed the auditory SL task first under implicit and subsequently under explicit conditions. Although behavioral evidence of SL was not obtained in either group, the neural responses elicited during the exposure phases of the SL tasks differentiated children with DLD from children with TLD. Specifically, the results indicated that only children from the TLD group showed neural evidence of SL, particularly in the SL task performed under explicit conditions, first for the low-TP and subsequently for the high-TP 'words'. Taken together, these findings support the view that children with DLD show deficits in the extraction of the regularities embedded in the auditory input, which might underlie their language difficulties.
Keywords: developmental language disorder, statistical learning, transitional probabilities, word segmentation
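The transitional probability statistic at the heart of the design is simple to compute; the sketch below uses a toy stream built from the abstract's example 'words' and shows high within-word TPs against lower word-boundary TPs.

```python
from collections import Counter

def transitional_probabilities(stream):
    """TP(A->B) = count(AB) / count(A), computed over a syllable stream."""
    pairs = Counter(zip(stream, stream[1:]))
    firsts = Counter(stream[:-1])
    return {(a, b): n / firsts[a] for (a, b), n in pairs.items()}

# Toy stream from two tri-syllabic 'words' (tokibu, tipolu) in the
# order A B B A; syllable spellings follow the abstract's examples.
stream = ["to", "ki", "bu", "ti", "po", "lu",
          "ti", "po", "lu", "to", "ki", "bu"]
for pair, tp in sorted(transitional_probabilities(stream).items()):
    print(pair, round(tp, 2))
# Within-word TPs are 1.0; at the word boundaries after 'lu' they drop to 0.5.
```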
Procedia PDF Downloads 189
26361 Static vs. Stream Mining Trajectories Similarity Measures
Authors: Musaab Riyadh, Norwati Mustapha, Dina Riyadh
Abstract:
Trajectory similarity can be defined as the cost of transforming one trajectory into another under a given similarity method. It is at the core of numerous mining tasks, such as clustering, classification, and indexing. Various approaches have been suggested to measure similarity based on the geometric and dynamic properties of trajectories, the overlap between trajectory segments, and the confined area between entire trajectories. In this article, these approaches are evaluated in terms of computational cost, memory usage, accuracy, and the amount of data that must be known in advance, to determine their suitability for stream mining applications. The evaluation results show that stream mining applications favor similarity methods that have low computational cost and memory usage, require only a single scan of the data, and are free of mathematical complexity, due to the high speed at which the data is generated.
Keywords: global distance measure, local distance measure, semantic trajectory, spatial dimension, stream data mining
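For contrast with stream-friendly measures, here is a classic 'static' similarity measure, dynamic time warping, which needs both complete trajectories and O(nm) work, exactly the cost profile the evaluation weighs against stream mining needs. The implementation and sample points are illustrative.

```python
import numpy as np

def dtw(traj_a, traj_b):
    """Dynamic time warping cost between two 2-D point sequences: a
    'static' measure requiring both full trajectories in memory."""
    a, b = np.asarray(traj_a, float), np.asarray(traj_b, float)
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = np.linalg.norm(a[i - 1] - b[j - 1])
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]

a = [(0, 0), (1, 1), (2, 2), (3, 3)]
b = [(0, 0), (1, 1.2), (2, 1.9), (3, 3.1)]
print(round(dtw(a, b), 3))   # small cost: nearly identical trajectories
```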
Procedia PDF Downloads 396
26360 Active Learning Based on Science Experiments to Improve Scientific Literacy
Authors: Kunihiro Kamataki
Abstract:
In this study, active learning based on simple science experiments was developed in a first-year university class in order to improve students' scientific literacy. Through active learning based on a simple experiment generating a cloud in a plastic bottle, students increased their interest in the global atmospheric problem and were able to discuss this problem constructively and find solutions from the various viewpoints of science and technology, politics, economics, diplomacy, and international relations. The results of questionnaires and free-form descriptions of this class indicate that the students improved their scientific literacy and their motivation to acquire knowledge in other lectures. This suggests that science experiments are a strong tool for rapidly improving students' intellectual curiosity, and that the connection between the impression left by a science experiment and students' interest in the social problem is very important for enhancing the learning effect of this kind of education.
Keywords: active learning, scientific literacy, simple scientific experiment, university education
Procedia PDF Downloads 262
26359 Usability and Biometric Authentication of Electronic Voting System
Authors: Nighat Ayub, Masood Ahmad
Abstract:
In this paper, a new voting system is developed and its usability is evaluated. The main features of this system are biometric verification of the voter followed by a few easy steps to cast a vote. Compared to existing systems (e.g., dual vote), the new system requires no advance training. Security is achieved via a multiple-key concept (another part of this project). More than 100 student voters participated in an election at the University of Malakand, Chakdara, Pakistan. To assess reliability, the voters cast their votes in two ways, i.e., paper-based voting and electronic voting using our new system. The results of the paper-based and electronic voting systems were compared, and it is concluded that the voters cast their votes for the intended candidates on the electronic voting system. The voters were asked to fill in a questionnaire, and its results were carefully analyzed; they show that the new system proposed in this paper is more secure and usable than other systems.
Keywords: e-voting, security, usability, authentication
Procedia PDF Downloads 396
26358 Understanding Rural Teachers’ Perceived Intention of Using Play in ECCE Mathematics Classroom: Strength-Based Approach
Authors: Nyamela M. ‘Masekhohola, Khanare P. Fumane
Abstract:
Lesotho's downward trend in mathematics attainment at all levels is compounded by the absence of innovative approaches to teaching and learning in early childhood. Studies have shown, however, that play pedagogy can be used to mitigate the challenges of mathematics education. Despite the benefits of play pedagogy for rural learners, its full potential has not been realized in early childhood care and education (ECCE) classrooms to improve children's performance in mathematics, because the adoption of play pedagogy depends on a strength-based approach. The study explores the potential of play pedagogy to improve mathematics education in early childhood care and education in Lesotho. The strength-based approach is known for its advocacy of recognizing and utilizing children's strengths, capacities, and interests. However, this approach and its promising attributes are not well known in Lesotho. In particular, little is known about the attributes of play pedagogy that are essential for improving mathematics education in ECCE programmes in Lesotho. To identify such attributes and strengthen mathematics education, this systematic review examines published evidence on the strengths of play pedagogy that support the teaching and learning of mathematics in ECCE. The purpose of this review is, therefore, to identify and define the strengths of play pedagogy that support mathematics education. Moreover, the study intends to understand rural teachers' perceived intention of using play in ECCE mathematics classrooms through a strength-based approach. Eight key strengths were found: cues for reflection, edutainment, mathematics language development, creativity and imagination, cognitive promotion, exploration, classification, and skills development. This study is the first to identify and define the strength-based attributes of play pedagogy for improving the teaching and learning of mathematics in ECCE centres in Lesotho. The findings reveal which opportunities teachers find important for improving the teaching of mathematics as early as in ECCE programmes. We conclude by discussing the implications of the literature for stimulating dialogues towards formulating strength-based approaches to teaching mathematics, as well as reflecting on the broader contributions of play pedagogy as an asset for improving mathematics in Lesotho and beyond.
Keywords: early childhood education, mathematics education, Lesotho, play pedagogy, strength-based approach
Procedia PDF Downloads 146
26357 Hierarchical Clustering Algorithms in Data Mining
Authors: Z. Abdullah, A. R. Hamdan
Abstract:
Clustering is the process of grouping data objects into clusters such that objects in the same cluster are similar to each other. Clustering is one of the main areas of data mining, and its algorithms can be classified as partitioning, hierarchical, density-based, or grid-based. In this paper, we survey and review four major hierarchical clustering algorithms: CURE, ROCK, CHAMELEON, and BIRCH. The resulting state of the art of these algorithms will help in eliminating their current problems, as well as in deriving more robust and scalable clustering algorithms.
Keywords: clustering, unsupervised learning, algorithms, hierarchical
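The shared skeleton of these algorithms, bottom-up agglomerative merging, can be shown with SciPy; CURE, ROCK, CHAMELEON, and BIRCH differ mainly in how clusters are summarized and merged, which this minimal sketch does not reproduce.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

# Toy 2-D points forming two obvious groups.
X = np.array([[0, 0], [0.2, 0.1], [0.1, 0.3],
              [5, 5], [5.2, 4.9], [4.8, 5.1]])

# Agglomerative hierarchical clustering: repeatedly merge the two
# closest clusters, producing a merge tree (dendrogram).
Z = linkage(X, method="average")
labels = fcluster(Z, t=2, criterion="maxclust")  # cut tree into 2 clusters
print(labels)                                    # e.g., [1 1 1 2 2 2]
```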
Procedia PDF Downloads 889
26356 A Study on the Correlation Analysis between the Pre-Sale Competition Rate and the Apartment Unit Plan Factor through Machine Learning
Authors: Seongjun Kim, Jinwooung Kim, Sung-Ah Kim
Abstract:
The development of information and communication technology affects human cognition and thinking; in the field of design especially, new techniques are being tried. In architecture, new design methodologies such as machine learning and data-driven design are being applied. In particular, these methodologies are used to analyze the factors related to the value of real estate or to analyze feasibility at the early planning stage of apartment housing. However, since the value of apartment buildings is often determined by external factors such as location and traffic conditions rather than by the interior elements of the buildings, data is rarely used in the design process. Therefore, even where the technical conditions are in place, it is difficult to apply data-driven design to the internal elements of an apartment. As a result, designers of apartment housing have been forced to rely on designer experience or modular design alternatives rather than data-driven design at the design stage, resulting in a uniform arrangement of space in apartment houses. The purpose of this study is to propose a methodology that supports designers in producing apartment unit plans with high consumer preference, by deriving the correlation and importance of the floor plan elements preferred by consumers through machine learning and reflecting this information in the early design process. Data on the pre-sale competition rate and the elements of the floor plan are collected, and the correlation between the pre-sale competition rate and the independent variables is analyzed through machine learning. This analytical model can be used to review an apartment unit plan produced by the designer and to assist the designer. It thus becomes possible to produce apartment floor plans with high preference, because the trained model can provide feedback on a unit plan when used in the floor plan design of apartment housing.
Keywords: apartment unit plan, data-driven design, design methodology, machine learning
Procedia PDF Downloads 269
26355 Integration of Polarization States and Color Multiplexing through a Singular Metasurface
Authors: Tarik Sipahi
Abstract:
Photonics research continues to push the boundaries of optical science, and the development of metasurface technology has emerged as a transformative force in this domain. This work presents the intricacies of a unified metasurface design tailored for efficient polarization and color control in optical systems. The proposed unified metasurface serves as a single, nanoengineered optical element capable of simultaneous polarization modulation and color encoding. Leveraging principles from metamaterials and nanophotonics, this design allows unprecedented control over the behavior of light at the subwavelength scale. The metasurface's spatially varying architecture enables seamless manipulation of both polarization states and color wavelengths, paving the way for a paradigm shift in optical system design. The advantages of this unified metasurface are diverse and impactful. By consolidating functions that traditionally require multiple optical components, the design streamlines optical systems, reducing complexity and enhancing overall efficiency. This approach is particularly promising for applications where compactness, weight, and multifunctionality are crucial. Furthermore, the proposed design not only enhances multifunctionality but also addresses key challenges in optical system design, offering a versatile solution for applications demanding compact and lightweight structures. The metasurface's capability to simultaneously manipulate polarization and color opens new possibilities in diverse technological fields. This research contributes to the evolution of optical science by showcasing the transformative potential of metasurface technology and emphasizing its role in reshaping the landscape of optical system architectures. The work represents a significant step forward in the ongoing pursuit of pushing the boundaries of photonics, providing a foundation for future innovations in compact and efficient optical devices.
Keywords: metasurface, nanophotonics, optical system design, polarization control
Procedia PDF Downloads 55
26354 Deep Learning for SAR Images Restoration
Authors: Hossein Aghababaei, Sergio Vitale, Giampaolo Ferraioli
Abstract:
In the context of Synthetic Aperture Radar (SAR) data, polarization is an important source of information for monitoring Earth's surface. SAR systems often transmit only one polarization, a constraint that leads to either single or dual polarimetric SAR imaging modalities. Single polarimetric systems operate with a fixed polarization of both the transmitted and received electromagnetic (EM) waves, resulting in a single acquisition channel. Dual polarimetric systems, on the other hand, transmit in one fixed polarization and receive in two orthogonal polarizations, resulting in two acquisition channels. Dual polarimetric systems are obviously more informative than single polarimetric systems and are increasingly being used for a variety of remote sensing applications. In dual polarimetric systems, the choice of polarizations for the transmitter and the receiver is open. Choosing a circular transmit polarization and coherent dual linear receive polarizations forms a special dual polarimetric system called hybrid polarimetry, which brings rotational invariance to the geometrical orientations of features in the scene and optimizes the design of the radar in terms of reliability, mass, and power constraints. The complete characterization of target scattering, however, requires fully polarimetric data, which can be acquired only with systems that transmit two orthogonal polarizations. This adds further complexity to data acquisition and shortens the coverage area, or swath, of fully polarimetric images compared with that of dual or hybrid polarimetric images. Solutions that augment dual polarimetric data to fully polarimetric data would therefore allow full characterization and exploitation of the backscattered field over a wider coverage with less system complexity. Several methods for reconstructing fully polarimetric images from hybrid polarimetric data can be found in the literature. Although the improvements achieved by the newly investigated reconstruction techniques are undeniable, the existing methods are mostly based on model assumptions (especially the assumption of reflection symmetry), which may limit their reliability and applicability in vegetation and forest scenarios. To overcome these problems, this paper proposes a new framework for reconstructing fully polarimetric information from hybrid polarimetric data. The framework uses deep learning to augment hybrid polarimetric data without relying on model assumptions. A convolutional neural network (CNN) with a specific architecture and loss function is defined for this augmentation problem, focusing on different scattering properties of the polarimetric data. In particular, the method controls the CNN training process with respect to several characteristic features of polarimetric images, defined by combining different terms in the cost (loss) function. The proposed method is experimentally validated on real data sets and compared with a well-known, standard approach from the literature. In the experiments, the reconstruction performance of the proposed framework is superior to that of conventional reconstruction methods. The pseudo fully polarimetric data reconstructed by the proposed method also agree well with the actual fully polarimetric images acquired by radar systems, confirming the reliability and efficiency of the proposed method.
Keywords: SAR image, polarimetric SAR image, convolutional neural network, deep learning, deep neural network
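A minimal sketch of what such an augmentation network could look like: a fully convolutional CNN mapping two hybrid-polarimetric channels to four fully polarimetric ones, trained with a composite loss. The channel counts, architecture, and loss terms are assumptions standing in for the paper's configuration, not the authors' network.

```python
import tensorflow as tf

# Assumed shapes: 2 input channels (hybrid acquisition), 4 output channels
# (fully polarimetric amplitudes); spatial size left unconstrained.
inp = tf.keras.layers.Input(shape=(None, None, 2))
x = tf.keras.layers.Conv2D(64, 3, padding="same", activation="relu")(inp)
x = tf.keras.layers.Conv2D(64, 3, padding="same", activation="relu")(x)
out = tf.keras.layers.Conv2D(4, 3, padding="same")(x)
model = tf.keras.Model(inp, out)

def polarimetric_loss(y_true, y_pred):
    """Composite loss: pixel fidelity plus a term on total backscattered
    power (span), standing in for the scattering-property terms the
    paper combines in its cost function (weights are assumed)."""
    pixel = tf.reduce_mean(tf.square(y_true - y_pred))
    span = tf.reduce_mean(tf.square(
        tf.reduce_sum(tf.square(y_true), -1)
        - tf.reduce_sum(tf.square(y_pred), -1)))
    return pixel + 0.1 * span

model.compile(optimizer="adam", loss=polarimetric_loss)
model.summary()
```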
Procedia PDF Downloads 72
26353 On the Exergy Analysis of the Aluminum Smelter
Authors: Ayoola T. Brimmo, Mohamed I. Hassan
Abstract:
The push to mitigate the aluminum smelting industry's enormous energy consumption and high emission releases has become even more persistent with recent climate change events. Common approaches to achieving this have focused on improving energy efficiency in the potline and cast house sections of the smelter. However, conventional energy efficiency analyses are based on the first law of thermodynamics, which does not shed proper light on the smelter's degradation of energy; it gives only a general idea of the furnace's performance, with no reference to the locations where improvement is possible based on the second law of thermodynamics. In this study, we apply exergy analyses to the potline and cast house sections of the smelter to identify the locations and causes of energy degradation. The exergy analyses, which are based on real-life smelter conditions, highlight possible locations for technology improvement in a typical smelter. With this established, methods of minimizing the smelter's exergy losses are assessed.
Keywords: exergy analysis, electrolytic cell, furnace, heat transfer
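The quantity at the center of such an analysis is flow exergy; a worked one-liner under assumed round property values (not smelter data):

```python
# Specific flow exergy of a stream relative to the dead state (T0, h0, s0):
#   psi = (h - h0) - T0 * (s - s0)
# Property values below are assumed round numbers for illustration.
T0 = 298.15             # dead-state temperature, K
h, h0 = 3230.9, 104.9   # specific enthalpy, kJ/kg
s, s0 = 6.921, 0.367    # specific entropy, kJ/(kg*K)

psi = (h - h0) - T0 * (s - s0)
print(f"specific flow exergy: {psi:.1f} kJ/kg")   # ~1171.9 kJ/kg
```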
Procedia PDF Downloads 289
26352 A Multi-Output Network with U-Net Enhanced Class Activation Map and Robust Classification Performance for Medical Imaging Analysis
Authors: Jaiden Xuan Schraut, Leon Liu, Yiqiao Yin
Abstract:
Computer vision in medical diagnosis has achieved a high level of success in diagnosing diseases with high accuracy. However, conventional classifiers that produce an image-to-label result provide insufficient information for medical professionals to judge, and they raise concerns over the trust and reliability of a model whose results cannot be explained. In order to gain local insight into cancerous regions, separate tasks such as image segmentation need to be implemented to aid doctors in treating patients, which doubles the training time and costs and renders the diagnosis system inefficient and difficult for the public to accept. To tackle this issue and drive AI-first medical solutions further, this paper proposes a multi-output network that follows a U-Net architecture for the image segmentation output and features an additional convolutional neural network (CNN) module for an auxiliary classification output. Class activation maps (CAMs) provide insight into the feature maps that lead to a convolutional neural network's classification; in the case of lung diseases, the region of interest is enhanced by U-Net-assisted CAM visualization. Our proposed model therefore combines an image segmentation model and a classifier to crop out only the lung region of a chest X-ray's class activation map, providing a visualization that improves explainability while generating classification results simultaneously, which builds trust in AI-led diagnosis systems. The proposed U-Net model achieves 97.61% accuracy and a Dice coefficient of 0.97 on testing data from the COVID-QU-Ex dataset, which includes both diseased and healthy lungs.
Keywords: multi-output network model, U-net, class activation map, image classification, medical imaging analysis
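A minimal sketch of the multi-output idea, assuming a toy encoder: one shared trunk feeds a U-Net-style segmentation head (with a skip connection) and a classification head, so a single forward pass yields both the lung mask used to crop the CAM and the label. Sizes and depths are illustrative, not the paper's model.

```python
import tensorflow as tf

inp = tf.keras.layers.Input(shape=(128, 128, 1))
e1 = tf.keras.layers.Conv2D(16, 3, padding="same", activation="relu")(inp)
p1 = tf.keras.layers.MaxPooling2D()(e1)
bottleneck = tf.keras.layers.Conv2D(32, 3, padding="same",
                                    activation="relu")(p1)

# Segmentation head: upsample and reuse encoder features (skip connection).
u1 = tf.keras.layers.UpSampling2D()(bottleneck)
u1 = tf.keras.layers.Concatenate()([u1, e1])
seg = tf.keras.layers.Conv2D(1, 1, activation="sigmoid", name="seg")(u1)

# Classification head: global pooling over the shared bottleneck.
c = tf.keras.layers.GlobalAveragePooling2D()(bottleneck)
cls = tf.keras.layers.Dense(1, activation="sigmoid", name="cls")(c)

model = tf.keras.Model(inp, [seg, cls])
model.compile(optimizer="adam",
              loss={"seg": "binary_crossentropy",
                    "cls": "binary_crossentropy"})
model.summary()
```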
Procedia PDF Downloads 205
26351 Possible Reasons for and Consequences of Generalizing Subgroup-Based Measurement Results to Populations: Based on Research Studies Conducted by Elementary Teachers in South Korea
Authors: Jaejun Jong
Abstract:
Many teachers in South Korea conduct research to improve the quality of their instruction. Unfortunately, many of them generalize the results of measurements based on one subgroup to other students or to the entire population, which can cause problems. This study aims to identify examples of the problems that can result from generalizing measurements based on one subgroup to an entire population or to another group. Such a study is needed because teachers' instruction and class quality significantly affect the overall quality of education, yet the quality of research conducted by teachers can become questionable due to overgeneralization; finding the potential problems of overgeneralization can thus improve the overall quality of education. The data in this study were gathered from 145 sixth-grade elementary school students in South Korea. The results showed that students in different classes can differ significantly in various ways; thus, generalizing the results of subgroups to an entire population can lead to erroneous student predictions and evaluations, which in turn can lead to inappropriate instruction plans. This result shows that identifying the causes of such overgeneralization can significantly improve the quality of education.
Keywords: generalization, measurement, research methodology, teacher education
Procedia PDF Downloads 95
26350 Task Scheduling and Resource Allocation in Cloud-based on AHP Method
Authors: Zahra Ahmadi, Fazlollah Adibnia
Abstract:
The scheduling of tasks and the optimal allocation of resources in the cloud must account for the dynamic nature of tasks and the heterogeneity of resources. Applications based on scientific workflows, which are characterized by high processing power and storage demands, are among the most widely used applications in this field. To increase their efficiency, it is necessary to schedule the tasks properly and select the best virtual machine in the cloud. The goals of the system are effective factors in task scheduling and resource selection, which depend on various criteria such as time, cost, current workload, and processing power. Multi-criteria decision-making methods are a good choice in this field. In this research, a new method for task scheduling and resource allocation in a heterogeneous environment, based on a modified AHP algorithm, is proposed. In this method, input tasks are scheduled according to two criteria: execution time and size. Resource allocation combines the AHP algorithm with a first-come, first-served policy for clients. Resources are prioritized by the criteria of main memory size, processor speed, and bandwidth. To modify the AHP algorithm, the Linear Max-Min and Linear Max normalization methods, which have a great impact on the ranking, are considered the best choices. The simulation results show a decrease in the average response time, turnaround time, and execution time of input tasks in the proposed method compared to similar (baseline) methods.
Keywords: hierarchical analytical process, work prioritization, normalization, heterogeneous resource allocation, scientific workflow
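The two normalization methods and the weighted prioritization can be sketched directly; the VM attribute values and AHP weights below are invented for illustration.

```python
import numpy as np

def linear_max_min(col):
    """Linear Max-Min normalization: map a criterion column onto [0, 1]."""
    return (col - col.min()) / (col.max() - col.min())

def linear_max(col):
    """Linear Max normalization (the alternative named in the abstract)."""
    return col / col.max()

# VM candidates scored on [memory GB, CPU speed GHz, bandwidth Mbps];
# values and AHP-derived criterion weights are illustrative.
vms = np.array([[8.0, 2.4, 100.0],
                [16.0, 3.0, 250.0],
                [4.0, 3.6, 500.0]])
weights = np.array([0.5, 0.3, 0.2])  # would come from AHP pairwise comparisons

norm = np.apply_along_axis(linear_max_min, 0, vms)
priority = norm @ weights            # weighted sum -> VM priority
print(priority, "-> pick VM", int(priority.argmax()))
```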
Procedia PDF Downloads 147
26349 Microfluidic Paper-Based Electrochemical Biosensor
Authors: Ahmad Manbohi, Seyyed Hamid Ahmadi
Abstract:
A low-cost paper-based microfluidic device (PAD) for the multiplexed electrochemical determination of glucose, uric acid, and dopamine in biological fluids was developed. Using wax printing, a PAD containing a central zone, six channels, and six detection zones was fabricated, and the electrodes were printed on the detection zones using a pre-made electrode template. Two detection zones were used for each analyte. The carbon working electrode was coated with chitosan-BSA (plus enzymes for glucose and uric acid). To detect glucose and uric acid, enzymatic reactions were employed; these enzyme-catalyzed redox reactions of the analytes produce free electrons for electrochemical measurement. Calibration curves were linear (R² > 0.980) in the ranges of 0-80 mM for glucose, 0.09-0.9 mM for dopamine, and 0-50 mM for uric acid, respectively. Blood samples were successfully analyzed by the proposed method.
Keywords: biological fluids, biomarkers, microfluidic paper-based electrochemical biosensors, multiplex
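The reported linear calibration can be illustrated with an ordinary least-squares fit; the concentration-current pairs below are hypothetical, not measured data.

```python
import numpy as np
from scipy.stats import linregress

# Hypothetical calibration points for the glucose zone:
# analyte concentration (mM) vs. measured current (uA).
conc = np.array([0, 10, 20, 40, 60, 80])
current = np.array([0.02, 1.1, 2.0, 4.1, 5.9, 8.0])

fit = linregress(conc, current)
print(f"slope={fit.slope:.3f} uA/mM, R^2={fit.rvalue**2:.3f}")

# Invert the calibration curve to read an unknown sample's concentration.
unknown_current = 3.0
print(f"estimated glucose: {(unknown_current - fit.intercept) / fit.slope:.1f} mM")
```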
Procedia PDF Downloads 284
26348 Comparison Performance between PID and PD Controllers for 3 and 4 Cable-Based Robots
Authors: Fouad. Inel, Lakhdar. Khochemane
Abstract:
This article compares the response specifications of two controllers for three- and four-cable-based robots in various applications. The main objectives of this work are as follows. First, the direct and inverse geometric models are used to study and simulate the end-effector position of the robot with three and four cables; a graphical user interface has been implemented to visualize the position of the robot. Second, we determine the static and dynamic cable tensions and the cable lengths required to follow different trajectories. Finally, we study the closed-loop response of our systems with Proportional-Integral-Derivative (PID) and Proportional-Derivative (PD) controllers and compare the results on the same examples using MATLAB/Simulink. We find that the PID method gives better performance, such as a faster speed of response and shorter settling time, than the PD controller.
Keywords: parallel cable-based robots, geometric modeling, dynamic modeling, graphical user interface, open loop, PID/PD controllers
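A discrete PID controller of the kind compared in the article is a few lines; setting ki to zero yields the PD variant. The gains, time step, and toy plant below are illustrative, not the cable robot model.

```python
class PID:
    """Discrete PID controller; set ki=0 to obtain the PD variant
    compared in the article. Gains are illustrative."""
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, setpoint, measured):
        error = setpoint - measured
        self.integral += error * self.dt
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

# Track a 1 m cable-length step with a crude first-order toy plant.
pid, pos, dt = PID(kp=5.0, ki=1.0, kd=0.5, dt=0.01), 0.0, 0.01
for _ in range(500):
    pos += pid.update(1.0, pos) * dt   # toy dynamics: velocity = control
print(f"position after 5 s: {pos:.3f} m")  # converges near the 1 m setpoint
```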
Procedia PDF Downloads 452