Search results for: automatic processing
3378 An Automatic Speech Recognition Tool for the Filipino Language Using the HTK System
Authors: John Lorenzo Bautista, Yoon-Joong Kim
Abstract:
This paper presents the development of a Filipino speech recognition tool using the HTK system. The system was trained on a subset of the Filipino Speech Corpus developed by the DSP Laboratory of the University of the Philippines-Diliman. The speech corpus was used in both training and testing the system by estimating the parameters of phonetic HMM-based (Hidden Markov Model) acoustic models. Experiments on different mixture weights were incorporated in the study. The phoneme-level, word-based recognition of a 5-state HMM resulted in an average accuracy rate of 80.13% for a single-Gaussian mixture model, 81.13% after implementing phoneme alignment, and 87.19% for the increased Gaussian-mixture weight model. The highest accuracy rate of 88.70% was obtained from a 5-state model with 6 Gaussian mixtures.
Keywords: Filipino language, Hidden Markov Model, HTK system, speech recognition
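As an illustrative companion to the abstract above, the sketch below shows how a 5-state, 6-mixture phoneme model of the kind described could be trained in Python. It is not the authors' HTK setup; it assumes the hmmlearn and librosa packages and MFCC features.

```python
# A minimal sketch (not the authors' HTK configuration): one 5-state, 6-mixture
# GMM-HMM per phoneme, trained on MFCC features, assuming hmmlearn and librosa.
import numpy as np
import librosa
from hmmlearn.hmm import GMMHMM

def mfcc_frames(wav_path, sr=16000, n_mfcc=13):
    """Load a waveform and return its MFCC frames (time x features)."""
    y, sr = librosa.load(wav_path, sr=sr)
    return librosa.feature.mfcc(y=y, sr=sr, n_mfcc=n_mfcc).T

def train_phoneme_model(wav_paths, n_states=5, n_mixtures=6):
    """Fit one GMM-HMM on all training examples of a phoneme."""
    feats = [mfcc_frames(p) for p in wav_paths]
    X = np.vstack(feats)
    lengths = [f.shape[0] for f in feats]
    model = GMMHMM(n_components=n_states, n_mix=n_mixtures,
                   covariance_type="diag", n_iter=25, random_state=0)
    model.fit(X, lengths)
    return model

def recognize(frame_seq, models):
    """Pick the phoneme whose model gives the highest log-likelihood."""
    return max(models, key=lambda ph: models[ph].score(frame_seq))
```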
Procedia PDF Downloads 480
3377 Selecting Answers for Questions with Multiple Answer Choices in Arabic Question Answering Based on Textual Entailment Recognition
Authors: Anes Enakoa, Yawei Liang
Abstract:
Question Answering (QA) is one of the most important and demanding tasks in the field of Natural Language Processing (NLP). In QA systems, the answer generation task produces a list of candidate answers to the user's question, of which only one answer is correct. Answer selection is one of the main components of QA and is concerned with selecting the best answer choice from the candidate answers suggested by the system. However, the selection process can be very challenging, especially in Arabic, due to its particularities. To address this challenge, an approach is proposed to answer questions with multiple answer choices for Arabic QA systems based on Textual Entailment (TE) recognition. The developed approach employs a Support Vector Machine that considers lexical, semantic and syntactic features in order to recognize the entailment between the generated hypotheses (H) and the text (T). A set of experiments has been conducted for performance evaluation, and the overall performance of the proposed method reached an accuracy of 67.5% with a C@1 score of 80.46%. The obtained results are promising and demonstrate that the proposed method is effective for the TE recognition task.
Keywords: information retrieval, machine learning, natural language processing, question answering, textual entailment
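The entailment classifier described above can be sketched roughly as follows. The three features used here (word overlap, length ratio, TF-IDF cosine) are simplified stand-ins for the paper's lexical, semantic and syntactic features, not the authors' actual feature set.

```python
# A simplified sketch of a T/H entailment classifier: hand-crafted lexical
# features feeding an SVM. Feature choices here are illustrative assumptions.
import numpy as np
from sklearn.svm import SVC
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

def pair_features(text, hypothesis, vectorizer):
    t_tokens, h_tokens = set(text.lower().split()), set(hypothesis.lower().split())
    overlap = len(t_tokens & h_tokens) / max(len(h_tokens), 1)
    length_ratio = len(h_tokens) / max(len(t_tokens), 1)
    tfidf = vectorizer.transform([text, hypothesis])
    cosine = cosine_similarity(tfidf[0], tfidf[1])[0, 0]
    return [overlap, length_ratio, cosine]

def train_entailment_svm(pairs, labels):
    """pairs: list of (text, hypothesis); labels: 1 = entails, 0 = does not."""
    vectorizer = TfidfVectorizer().fit([t for t, _ in pairs] + [h for _, h in pairs])
    X = np.array([pair_features(t, h, vectorizer) for t, h in pairs])
    clf = SVC(kernel="rbf", C=1.0).fit(X, labels)
    return clf, vectorizer
```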
Procedia PDF Downloads 145
3376 Segmentation of Korean Words on Korean Road Signs
Authors: Lae-Jeong Park, Kyusoo Chung, Jungho Moon
Abstract:
This paper introduces an effective method of segmenting Korean text (place names in Korean) from a Korean road sign image. A Korean advanced directional road sign is composed of several types of visual information, such as arrows, place names in Korean and English, and route numbers. Automatic classification of the visual information and extraction of Korean place names from road sign images make it possible to avoid a lot of manual input to a database system for the nationwide management of road signs. We propose a series of problem-specific heuristics that correctly segment Korean place names, which are the most crucial information, from the other information by effectively leaving out non-text content. The experimental results on a dataset of 368 road sign images show a detection rate of 96% per Korean place name and 84% per road sign image.
Keywords: segmentation, road signs, characters, classification
Procedia PDF Downloads 444
3375 Hand Gesture Detection via EmguCV Canny Pruning
Authors: N. N. Mosola, S. J. Molete, L. S. Masoebe, M. Letsae
Abstract:
Hand gesture recognition is a technique used to locate, detect, and recognize a hand gesture. Detection and recognition are concepts of Artificial Intelligence (AI). AI concepts are applicable in Human-Computer Interaction (HCI), Expert Systems (ES), etc. Hand gesture recognition can be used in sign language interpretation. Sign language is a visual communication tool. This tool is used mostly by deaf communities and those with speech disorders. Communication barriers exist when people with speech disorders interact with others. This research aims to build a hand recognition system for Lesotho’s Sesotho and English language interpretation. The system will help to bridge the communication problems encountered by the mentioned communities. The system has various processing modules. The modules consist of a hand detection engine, an image processing engine, feature extraction, and sign recognition. Detection is a process of identifying an object. The proposed system uses Canny-pruning and Haar cascade detection algorithms. Canny pruning implements Canny edge detection, an optimal image processing algorithm used to detect the edges of an object. The system employs a skin detection algorithm. The skin detection performs background subtraction and computes the convex hull and the centroid to assist in the detection process. Recognition is a process of gesture classification. Template matching classifies each hand gesture in real time. The system was tested using various experiments. The results obtained show that time, distance, and light are factors that affect the rate of detection and, ultimately, recognition. The detection rate is directly proportional to the distance of the hand from the camera. Different lighting conditions were considered: the higher the light intensity, the faster the detection rate. Based on the results obtained from this research, the applied methodologies are efficient and provide a plausible solution towards a lightweight, inexpensive system which can be used for sign language interpretation.
Keywords: canny pruning, hand recognition, machine learning, skin tracking
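A rough sketch of the skin-detection stage described above is given below. The HSV bounds are assumed values, background subtraction is omitted, and the Haar-cascade detection engine would additionally require a separately trained hand cascade, which is not shown here.

```python
# A minimal sketch of HSV skin thresholding, convex hull and centroid, plus a
# Canny edge map. Thresholds are illustrative assumptions, not the paper's values.
import cv2
import numpy as np

def detect_hand(frame_bgr):
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    # Rough skin-colour range (assumed; tune for lighting conditions).
    skin_mask = cv2.inRange(hsv, (0, 40, 60), (25, 255, 255))
    edges = cv2.Canny(frame_bgr, 100, 200)                 # edge map used for pruning
    contours, _ = cv2.findContours(skin_mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    hand = max(contours, key=cv2.contourArea)              # largest skin blob
    hull = cv2.convexHull(hand)
    m = cv2.moments(hand)
    centroid = (int(m["m10"] / m["m00"]), int(m["m01"] / m["m00"])) if m["m00"] else None
    return {"contour": hand, "hull": hull, "centroid": centroid, "edges": edges}
```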
Procedia PDF Downloads 185
3374 Effect of Biostimulants to Control the Phelipanche ramosa L. Pomel in Processing Tomato Crop
Authors: G. Disciglio, G. Gatta, F. Lops, A. Libutti, A. Tarantino, E. Tarantino
Abstract:
The experimental trial was carried out in an open field in the Foggia district (Apulia Region, Southern Italy) during the spring-summer season 2014, in order to evaluate the effect of four biostimulant products (Radicon®, Viormon plus®, Lysodin® and Siapton® 10L), compared with a control (no biostimulant), on the infestation of a processing tomato crop (cv Dres) by the chlorophyll-lacking root parasite Phelipanche ramosa. Biostimulants consist of different categories of products (microbial inoculants, humic and fulvic acids, hydrolyzed proteins and amino acids, seaweed extracts) which play various roles in plant growth, including the improvement of crop resistance and of the quali-quantitative characteristics of yield. The experimental trial was arranged according to a complete randomized block design with five treatments, each one replicated three times. The processing tomato seedlings were transplanted on 5 May 2014. Throughout the crop cycle, P. ramosa infestation was assessed according to the number of emerged shoots (branched plants) counted in each plot at 66, 78 and 92 days after transplanting. The tomato fruits were harvested at the full stage of maturity on 8 August 2014. From each plot, the marketable yield was measured and the quali-quantitative yield parameters (mean weight, dry matter content, colour coordinate, colour index and soluble solids content of the fruits) were determined. The whole dataset was tested according to the basic assumptions for the analysis of variance (ANOVA), and the differences between the means were determined using Tukey’s tests at the 5% probability level. The results of the study showed that none of the applied biostimulants provided complete control of Phelipanche, although some positive effects were obtained from their application. In this respect, Radicon® appeared to be the most effective in reducing the infestation of this root parasite in the tomato crop. This treatment also gave the highest tomato yield.
Keywords: biostimulant, control methods, Phelipanche ramosa, tomato crop
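The statistical analysis described above can be sketched as follows. The sketch uses a simple one-way layout rather than the randomized block model of the trial, and the column names are hypothetical.

```python
# A minimal sketch of ANOVA followed by Tukey's HSD at the 5% level,
# comparing the five treatments. Column names are assumed.
import pandas as pd
from scipy.stats import f_oneway
from statsmodels.stats.multicomp import pairwise_tukeyhsd

def compare_treatments(df, value_col="emerged_shoots", group_col="treatment"):
    groups = [g[value_col].values for _, g in df.groupby(group_col)]
    f_stat, p_value = f_oneway(*groups)                          # one-way ANOVA
    tukey = pairwise_tukeyhsd(df[value_col], df[group_col], alpha=0.05)
    return f_stat, p_value, tukey.summary()
```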
Procedia PDF Downloads 301
3373 Automatic Seizure Detection Using Weighted Permutation Entropy and Support Vector Machine
Authors: Noha Seddik, Sherine Youssef, Mohamed Kholeif
Abstract:
The automated epileptic seizure detection research field has emerged in recent years; it involves analyzing the electroencephalogram (EEG) signals instead of the traditional visual inspection performed by expert neurologists. In this study, a Support Vector Machine (SVM) that uses Weighted Permutation Entropy (WPE) as the input feature is proposed for classifying normal and seizure EEG records. WPE is a modified statistical parameter of the permutation entropy (PE) that measures the complexity and irregularity of a time series. It incorporates both the mapped ordinal pattern of the time series and the information contained in the amplitude of its sample points. The proposed system utilizes the fact that entropy-based measures for the EEG segments during an epileptic seizure are lower than in normal EEG.
Keywords: electroencephalogram (EEG), epileptic seizure detection, weighted permutation entropy (WPE), support vector machine (SVM)
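A minimal sketch of the WPE feature is shown below, following the common definition in which each ordinal pattern is weighted by the local variance of its embedded vector; the order and delay values are illustrative.

```python
# Weighted permutation entropy of one signal segment: ordinal patterns
# weighted by the variance (amplitude information) of each embedded vector.
import numpy as np

def weighted_permutation_entropy(x, order=3, delay=1):
    x = np.asarray(x, dtype=float)
    n = len(x) - (order - 1) * delay
    patterns, total_weight = {}, 0.0
    for i in range(n):
        window = x[i:i + order * delay:delay]
        pattern = tuple(np.argsort(window))              # ordinal pattern
        w = np.var(window)                               # amplitude-aware weight
        patterns[pattern] = patterns.get(pattern, 0.0) + w
        total_weight += w
    if total_weight == 0:
        return 0.0
    p = np.array(list(patterns.values())) / total_weight
    return float(-np.sum(p * np.log2(p)))

# Example: noise yields a WPE near log2(3!) bits, a smooth sinusoid much less.
rng = np.random.default_rng(0)
print(weighted_permutation_entropy(rng.normal(size=1000)))
print(weighted_permutation_entropy(np.sin(np.linspace(0, 20, 1000))))
```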
Procedia PDF Downloads 372
3372 Deep Learning Approach for Colorectal Cancer’s Automatic Tumor Grading on Whole Slide Images
Authors: Shenlun Chen, Leonard Wee
Abstract:
Tumor grading is an essential reference for colorectal cancer (CRC) staging and survival prognostication. The widely used World Health Organization (WHO) grading system defines the histological grade of CRC adenocarcinoma based on the density of glandular formation on whole slide images (WSI). Tumors are classified as well-, moderately-, poorly- or un-differentiated depending on the percentage of the tumor that is gland forming: >95%, 50-95%, 5-50% and <5%, respectively. However, manually grading WSIs is a time-consuming process and can cause observer error due to subjective judgment and unnoticed regions. Furthermore, pathologists’ grading is usually coarse, while a finer and continuous differentiation grade may help to stratify CRC patients better. In this study, a deep learning based automatic differentiation grading algorithm was developed and evaluated by survival analysis. Firstly, a gland segmentation model was developed for segmenting gland structures. Gland regions of WSIs were delineated and used for differentiation annotation. Tumor regions were annotated by experienced pathologists into high-, medium-, low-differentiation and normal tissue, which correspond to tumor with clear, unclear or no gland structure and to non-tumor, respectively. Then a differentiation prediction model was developed on these human annotations. Finally, all enrolled WSIs were processed by the gland segmentation model and the differentiation prediction model. The differentiation grade can be calculated from the deep learning models’ prediction of tumor regions and tumor differentiation status according to the WHO definitions. If multiple WSIs belonged to a patient, the highest differentiation grade was chosen. Additionally, the differentiation grade was normalized onto a scale between 0 and 1. The Cancer Genome Atlas COAD (TCGA-COAD) project was enrolled in this study. For the gland segmentation model, the receiver operating characteristic (ROC) reached 0.981 and accuracy reached 0.932 on the validation set. For the differentiation prediction model, ROC reached 0.983, 0.963, 0.963, 0.981 and accuracy reached 0.880, 0.923, 0.668, 0.881 for the groups of low-, medium-, high-differentiation and normal tissue on the validation set. Four hundred and one patients were selected after removing WSIs without gland regions and patients without follow-up data. The concordance index reached 0.609. An optimized cut-off point of 51% was found by the “maxstat” method, which is almost the same as the WHO system’s cut-off point of 50%. Both the WHO system’s cut-off point and the optimized cut-off point performed impressively in Kaplan-Meier curves, and both p-values of the log-rank test were below 0.005. In this study, the gland structure of WSIs and the differentiation status of tumor regions were proven to be predictable through deep learning methods. A finer and continuous differentiation grade can also be automatically calculated through the above models. The differentiation grade was proven to stratify CRC patients well in survival analysis, and its optimized cut-off point was almost the same as that of the WHO tumor grading system. A tool that automatically calculates the differentiation grade may show potential in the fields of therapy decision making and personalized treatment.
Keywords: colorectal cancer, differentiation, survival analysis, tumor grading
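The survival evaluation described above can be sketched with the lifelines package as follows. The column names and the use of lifelines (rather than the authors' actual tooling) are assumptions.

```python
# A minimal sketch: split patients at a differentiation-grade cut-off, compare
# Kaplan-Meier curves with a log-rank test, and score the continuous grade
# with the concordance index. Higher grade is assumed to mean worse prognosis.
import pandas as pd
from lifelines import KaplanMeierFitter
from lifelines.statistics import logrank_test
from lifelines.utils import concordance_index

def evaluate_grade(df, cutoff=0.51):
    """df columns (hypothetical): 'grade' in [0, 1], 'time', 'event' (1 = death)."""
    high = df["grade"] > cutoff
    lr = logrank_test(df.loc[high, "time"], df.loc[~high, "time"],
                      event_observed_A=df.loc[high, "event"],
                      event_observed_B=df.loc[~high, "event"])
    km_high = KaplanMeierFitter().fit(df.loc[high, "time"],
                                      df.loc[high, "event"], label="high grade")
    # Negate the grade so that larger predicted values mean longer survival.
    c_index = concordance_index(df["time"], -df["grade"], df["event"])
    return lr.p_value, c_index, km_high
```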
Procedia PDF Downloads 134
3371 Kindergarten Children’s Reactions to the COVID-19 Pandemic: Creating a Sense of Coherence
Authors: Bilha Paryente, Roni Gez Langerman
Abstract:
Background and Objectives: The current study focused on how kindergarten children have experienced the COVID-19 pandemic. The main goals were understanding children’s emotions, coping strategies, and thoughts regarding the presence of the COVID-19 virus in their daily lives, using the salutogenic approach to study their sense of coherence and to promote relevant professional instruction. Design and Method: Semi-structured in-depth interviews were held with 130 five- to six-year-old children, with an equal number of boys and girls. All of the children were recruited from kindergartens affiliated with the state's secular education system. Results: Data were structured into three themes: 1) the child’s pandemic perception as manageable through meaningful accompanying and missing figures; 2) the child’s comprehension of the virus as dangerous, age-differentiating, and contagious; 3) the child’s emotional processing of the pandemic as arousing fear of death and, through images, as thorny and as a monster. Conclusions: The results demonstrate the young children’s sense of coherence, characterized as extrapersonal perception, interpersonal coping, and intrapersonal emotional processing, and the need for greater acknowledgement of child-parent educators' informed interventions that could give children a partial feeling of the adults’ awareness of their needs.
Keywords: kindergarten children, continuous stress, COVID-19, salutogenic approach
Procedia PDF Downloads 177
3370 Influence of Silicon Carbide Particle Size and Thermo-Mechanical Processing on Dimensional Stability of Al 2124SiC Nanocomposite
Authors: Mohamed M. Emara, Heba Ashraf
Abstract:
This study investigates the effect of silicon carbide (SiC) particle size and thermo-mechanical processing on the dimensional stability of aluminum alloy 2124. Three SiC weight fractions, 2.5, 5, and 10 wt. %, with different SiC particle sizes (25 μm, 5 μm, and 100 nm) were produced using a mechanical ball mill. The standard testing samples were fabricated using the powder metallurgy technique. Both sample sets, prior to and after extrusion, were heated from room temperature up to 400ºC in a dilatometer at different heating rates, that is, 10, 20, and 40ºC/min. The analysis showed that for all materials there was an increase in length change as temperature increased, and that the temperature sensitivity of the aluminum alloy decreased in the presence of both micro- and nano-sized silicon carbide. For all conditions, the nanocomposites showed better dimensional stability compared to conventional Al 2124/SiC composites. The after-extrusion samples showed better thermal stability and less temperature sensitivity of the aluminum alloy for both micro- and nano-sized silicon carbide.
Keywords: aluminum 2124 metal matrix composite, SiC nano-sized reinforcements, powder metallurgy, extrusion mechanical ball mill, dimensional stability
Procedia PDF Downloads 526
3369 Normal and Peaberry Coffee Beans Classification from Green Coffee Bean Images Using Convolutional Neural Networks and Support Vector Machine
Authors: Hira Lal Gope, Hidekazu Fukai
Abstract:
The aim of this study is to develop a system which can identify and sort peaberries automatically at low cost for coffee producers in developing countries. In this paper, the focus is on the classification of peaberries and normal coffee beans using image processing and machine learning techniques. The peaberry is not a defective bean, but it is not a normal bean either. A peaberry is a single, relatively round seed that develops in a coffee cherry instead of the usual flat-sided pair of beans. It has a different value and flavor. To make the taste of the coffee better, it is necessary to separate the peaberries and normal beans before roasting the green coffee beans; otherwise, the tastes of the beans will be mixed and the result will be poor. At roasting time, all the beans should be uniform in shape, size, and weight; otherwise, larger beans take more time to roast through. The peaberry has a different size and a different shape even though it has the same weight as a normal bean, and it roasts more slowly than other normal beans. Therefore, sorting by size or weight alone does not provide a good option for selecting the peaberries. Defective beans, e.g., sour, broken, black, and faded beans, are easy to check and pick out manually by hand. On the other hand, picking out peaberries is very difficult even for trained specialists because the shape and color of the peaberry are similar to normal beans. In this study, we use image processing and machine learning techniques to discriminate between normal and peaberry beans as part of the sorting system. As the first step, we applied deep Convolutional Neural Networks (CNN) and a Support Vector Machine (SVM) as machine learning techniques to discriminate the peaberry from the normal bean. As a result, better performance was obtained with the CNN than with the SVM for the discrimination of the peaberry. The artificial neural network trained on a high-performance CPU and GPU in this work will simply be installed into the inexpensive and computationally limited Raspberry Pi system. We assume that this system will be used in developing countries. The study evaluates and compares the feasibility of the methods in terms of classification accuracy and processing speed.
Keywords: convolutional neural networks, coffee bean, peaberry, sorting, support vector machine
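A small binary CNN of the kind that could be used for this task is sketched below. It is not the authors' network; the input size and layer sizes are illustrative assumptions.

```python
# A minimal sketch of a binary peaberry-vs-normal CNN, assuming TensorFlow/Keras
# and 64x64 RGB bean crops. Architecture choices here are assumptions.
import tensorflow as tf

def build_bean_cnn(input_shape=(64, 64, 3)):
    model = tf.keras.Sequential([
        tf.keras.layers.Conv2D(16, 3, activation="relu", input_shape=input_shape),
        tf.keras.layers.MaxPooling2D(),
        tf.keras.layers.Conv2D(32, 3, activation="relu"),
        tf.keras.layers.MaxPooling2D(),
        tf.keras.layers.Flatten(),
        tf.keras.layers.Dense(64, activation="relu"),
        tf.keras.layers.Dense(1, activation="sigmoid"),   # 1 = peaberry, 0 = normal
    ])
    model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
    return model

# model = build_bean_cnn()
# model.fit(train_images, train_labels, epochs=10,
#           validation_data=(val_images, val_labels))
```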
Procedia PDF Downloads 144
3368 A Comparative Human Rights Analysis of Expulsion as a Counterterrorism Instrument: An Evaluation of Belgium
Authors: Louise Reyntjens
Abstract:
Where criminal law used to be the traditional response to cope with the terrorist threat, European governments are increasingly relying on administrative paths. The reliance on immigration law fits into this trend. Terrorism is seen as a civilization menace emanating from abroad. In this context, the expulsion of dangerous aliens, immigration law’s core task, is put forward as a key security tool. Governments all over Europe are focusing on removing dangerous individuals from their territory rather than bringing them to justice. This research reflects on the consequences for the expelled individuals’ fundamental rights. For this, the author selected four European countries for a comparative study: Belgium, France, the United Kingdom and Sweden. All these countries face similar social and security issues, igniting the recourse to immigration law as a counterterrorism tool. Yet, they adopt very different approaches to this: the United Kingdom positions itself on the repressive side of the spectrum. Sweden, on the other hand, also 'securitized' its immigration policy after the recent terrorist attack in Stockholm, but remains on the tolerant side of the spectrum. Belgium and France are situated in between. This paper addresses the situation in Belgium. In 2017, the Belgian parliament introduced several legislative changes by which it considerably expanded and facilitated the possibility to expel unwanted aliens. First, the expulsion measure was subjected to new and questionable definitions: a serious attack on the nation’s safety used to be required to expel certain categories of aliens. Presently, mere suspicions suffice to fulfil the new definition of a 'serious threat to national security', a definition which fails to respond to the principle of legality; neither the law nor the preparatory works clarify what is meant by 'a threat to national security'. This creates the risk of submitting this concept’s interpretation almost entirely to the discretion of the immigration authorities. Secondly, in the name of intervening more quickly and efficiently, the automatic suspensive appeal for expulsions was abolished. The European Court of Human Rights nonetheless requires such an automatic suspensive appeal under Articles 13 and 3 of the Convention. Whether this procedural reform will endure is thus questionable. This contribution also raises questions regarding expulsion’s efficacy as a key security tool. In a globalized and mobilized world, particularly in a European Union with no internal boundaries, questions can be raised about the usefulness of this measure. Even more so, by simply expelling a dangerous individual, States avoid their responsibility and shift the risk to another State. Criminal law might in these instances be more capable of providing a conclusive and long-term response. This contribution explores the human rights consequences of expulsion as a security tool in Belgium. It also offers a critical view on its efficacy for protecting national security.
Keywords: Belgium, counter-terrorism and human rights, expulsion, immigration law
Procedia PDF Downloads 127
3367 HD-WSComp: Hypergraph Decomposition for Web Services Composition Based on QoS
Authors: Samah Benmerbi, Kamal Amroun, Abdelkamel Tari
Abstract:
The increasing number of Web service (WS) providers throughout the globe has produced numerous Web services providing the same or similar functionality. Therefore, there is a need for tools that develop the best answer to queries by selecting and composing services with total transparency. This paper reviews various QoS-based Web service selection mechanisms and architectures which facilitate qualitatively optimal selection. In fact, Web service composition is required when a request cannot be fulfilled by a single Web service. In such cases, it is preferable to integrate existing Web services to satisfy the user’s request. We introduce an automatic Web service composition method based on hypergraph decomposition using the hypertree decomposition method. The problem of the selection and composition of Web services is transformed into a resolution in a hypertree by exploring the dependency relations between Web services, to obtain a composite Web service via an execution order of WSs satisfying the global request.
Keywords: web service, web service selection, web service composition, QoS, hypergraph decomposition, BE hypergraph decomposition, hypertree resolution
Procedia PDF Downloads 510
3366 Sepiolite as a Processing Aid in Fibre Reinforced Cement Produced in Hatschek Machine
Authors: R. Pérez Castells, J. M. Carbajo
Abstract:
Sepiolite has been used as a processing aid in the manufacture of fibre cement since the start of the replacement of asbestos in the 80s. Sepiolite increases the inter-laminar bond between cement layers and improves the homogeneity of the slurries. A new type of sepiolite processed product, Wollatrop TF/C, has been evaluated as a retention agent for fine particles in the production of fibre cement on a Hatschek machine. The effect of Wollatrop TF/C on filtering and fine particle losses was studied, as well as its interaction with anionic polyacrylamide (APAM) and microsilica. The experimental design was factorial, and the VDT equipment used for measuring retention and drainage was a modified Rapid Köthen laboratory sheet former. Wollatrop TF/C increased fine particle retention, improving the economy of the process and reducing the accumulation of solids in recycled process water. At the same time, drainage time increased sharply at high concentrations; however, drainage time can be improved by adjusting the APAM concentration. Wollatrop TF/C and microsilica show very little interaction with each other. Microsilica does not control fine particle losses, while Wollatrop TF/C does so efficiently. Further research on APAM type (molecular weight and anionic character) is advisable to improve drainage.
Keywords: drainage, fibre-reinforced cement, fine particle losses, flocculation, microsilica, sepiolite
Procedia PDF Downloads 326
3365 Effect of Clinical Depression on Automatic Speaker Verification
Authors: Sheeraz Memon, Namunu C. Maddage, Margaret Lech, Nicholas Allen
Abstract:
The effect of a clinical environment on the accuracy of speaker verification was tested. The speaker verification tests were performed within homogeneous environments containing clinically depressed speakers only and non-depressed speakers only, as well as within mixed environments containing different mixtures of both clinically depressed and non-depressed speakers. The speaker verification framework included MFCC features and the GMM modeling and classification method. The speaker verification experiments within homogeneous environments showed a 5.1% increase of the EER within the clinically depressed environment when compared to the non-depressed environment. This indicates that clinical depression increases the intra-speaker variability and makes the speaker verification task more challenging. Experiments with mixed environments indicated that increasing the percentage of depressed individuals within a mixed environment increases the speaker verification equal error rate.
Keywords: speaker verification, GMM, EM, clinical environment, clinical depression
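The MFCC+GMM framework mentioned above can be sketched as follows; the number of mixture components and the acceptance threshold are illustrative, and a universal background model is omitted.

```python
# A minimal sketch of MFCC+GMM speaker verification: enroll a GMM per speaker
# on MFCC frames and accept a trial if its log-likelihood exceeds a threshold.
import numpy as np
import librosa
from sklearn.mixture import GaussianMixture

def mfcc(wav_path, sr=16000, n_mfcc=13):
    y, _ = librosa.load(wav_path, sr=sr)
    return librosa.feature.mfcc(y=y, sr=sr, n_mfcc=n_mfcc).T   # frames x coefficients

def enroll(wav_paths, n_components=32):
    """Fit a diagonal-covariance GMM on all enrollment utterances of a speaker."""
    X = np.vstack([mfcc(p) for p in wav_paths])
    return GaussianMixture(n_components=n_components,
                           covariance_type="diag", max_iter=100).fit(X)

def verify(model, wav_path, threshold):
    """Return (accept, score); score is the mean per-frame log-likelihood."""
    score = model.score(mfcc(wav_path))
    return score > threshold, score
```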
Procedia PDF Downloads 375
3364 Effect of Personality Traits on Classification of Political Orientation
Authors: Vesile Evrim, Aliyu Awwal
Abstract:
Today, as in other domains, there is an enormous number of political transcripts available on the Web waiting to be mined and used for various purposes such as statistics and recommendations. Therefore, automatically determining the political orientation of these transcripts becomes crucial. The methodologies used by machine learning algorithms for this automatic classification are based on different features, such as linguistic features. Considering the ideological differences between liberals and conservatives, this paper studies the effect of personality traits on political orientation classification. This is done by considering the correlation between LIWC features and the Big Five personality traits. Several experiments are conducted on the Convote U.S. Congressional-Speech dataset with seven benchmark classification algorithms. The different methodologies are applied to select different feature sets consisting of a varying number of features, from 8 to 64. While Neuroticism was found to be the most differentiating personality trait for the classification of political polarity, when its top 10 representative features were combined with several classification algorithms, it outperformed the results presented in previous research.
Keywords: politics, personality traits, LIWC, machine learning
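The correlation-based feature selection described above can be sketched roughly as follows. The column names, trait scores and the particular classifiers shown are illustrative assumptions rather than the paper's exact setup.

```python
# A minimal sketch: rank LIWC features by their correlation with a personality-
# trait score, keep the top 10, and feed them to a few benchmark classifiers.
import pandas as pd
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC
from sklearn.naive_bayes import GaussianNB
from sklearn.tree import DecisionTreeClassifier

def top_trait_features(liwc_df, trait_scores, k=10):
    """liwc_df: one row per speech, one column per LIWC feature; trait_scores: a Series."""
    corr = liwc_df.corrwith(trait_scores).abs().sort_values(ascending=False)
    return list(corr.index[:k])

def benchmark(liwc_df, trait_scores, labels):
    feats = top_trait_features(liwc_df, trait_scores)
    X, y = liwc_df[feats], labels                    # labels: liberal vs. conservative
    for name, clf in [("SVM", SVC()), ("NB", GaussianNB()),
                      ("Tree", DecisionTreeClassifier())]:
        print(name, cross_val_score(clf, X, y, cv=5).mean())
```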
Procedia PDF Downloads 495
3363 Development of a Rating Scale for Elementary EFL Writing
Authors: Mohammed S. Assiri
Abstract:
In EFL programs, rating scales used in writing assessment are often constructed by intuition. Intuition-based scales tend to provide inaccurate and divisive ratings of learners’ writing performance. Hence, following an empirical approach, this study attempted to develop a rating scale for elementary-level writing at an EFL program in Saudi Arabia. Towards this goal, 98 students’ essays were scored and then coded using a comprehensive taxonomy of writing constructs and their measures. Automatic linear modeling was run to find out which measures would best predict essay scores. A nonparametric ANOVA, the Kruskal-Wallis test, was then used to determine which measures could best differentiate among scoring levels. The findings indicated that certain measures could serve as either good predictors of essay scores or differentiators among scoring levels, or both. The main conclusion was that a rating scale can be empirically developed using predictive and discriminative statistical tests.
Keywords: analytic scoring, rating scales, writing assessment, writing constructs, writing performance
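The discrimination test described above can be sketched as follows; the column names are hypothetical.

```python
# A minimal sketch: for each writing measure, a Kruskal-Wallis test checks
# whether its values differ across the scoring levels.
import pandas as pd
from scipy.stats import kruskal

def discriminating_measures(df, score_col="score_level", alpha=0.05):
    """df: one row per essay, a scoring-level column plus one column per measure."""
    results = {}
    for measure in df.columns.drop(score_col):
        groups = [g[measure].dropna() for _, g in df.groupby(score_col)]
        h_stat, p_value = kruskal(*groups)
        results[measure] = (h_stat, p_value, p_value < alpha)
    return pd.DataFrame(results, index=["H", "p", "discriminates"]).T
```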
Procedia PDF Downloads 463
3362 Analytical Comparison of Conventional Algorithms with Vedic Algorithm for Digital Multiplier
Authors: Akhilesh G. Naik, Dipankar Pal
Abstract:
In today’s scenario, the complexity of digital signal processing (DSP) applications and various microcontroller architectures has been increasing to such an extent that the traditional approaches to multiplier design in most processors are becoming outdated for being comparatively slow. Modern processing applications require suitable pipelined approaches, and therefore, algorithms that are friendlier to pipelined architectures. Traditional algorithms like the Wallace Tree, Radix-4 Booth, Radix-8 Booth and Dadda architectures have proven to be comparatively slow for pipelined architectures. These architectures, therefore, need to be optimized or combined with other architectures to enhance their performance and to be made suitable for pipelined hardware/architectures. Recently, the Vedic algorithm has mathematically proven to be efficient, appearing less complex and requiring fewer steps to establish its output, and has assumed renewed importance. This paper describes and shows how the Vedic algorithm can be better suited to pipelined architectures and can also be combined with traditional architectures and algorithms to enhance its ability even further. In this paper, we also establish that for complex applications on DSP and other microcontroller architectures, using the Vedic approach for multiplication proves to be the best available and most efficient option.
Keywords: Wallace Tree, Radix-4 Booth, Radix-8 Booth, Dadda, Vedic, Single-Stage Karatsuba (SSK), Looped Karatsuba (LK)
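For reference, the Vedic Urdhva Tiryagbhyam ("vertically and crosswise") scheme can be sketched in software as below; each output column is an independent sum of cross-products, which is what makes the method amenable to parallel and pipelined hardware.

```python
# A minimal sketch of Urdhva Tiryagbhyam multiplication on digit lists.
def vedic_multiply(a_digits, b_digits, base=10):
    """Multiply two numbers given as digit lists, least-significant digit first."""
    n, m = len(a_digits), len(b_digits)
    columns = [0] * (n + m - 1)
    for i in range(n):                 # all columns can be formed in parallel
        for j in range(m):
            columns[i + j] += a_digits[i] * b_digits[j]
    result, carry = [], 0              # final carry-propagation pass
    for s in columns:
        s += carry
        result.append(s % base)
        carry = s // base
    while carry:
        result.append(carry % base)
        carry //= base
    return result                      # least-significant digit first

# Example: 23 x 14 = 322 -> digits [2, 2, 3]
print(vedic_multiply([3, 2], [4, 1]))
```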
Procedia PDF Downloads 169
3361 Parameters Tuning of a PID Controller on a DC Motor Using Honey Bee and Genetic Algorithms
Authors: Saeid Jalilzadeh
Abstract:
PID controllers are widely used to control industrial plants because of their robustness and simple structure. Tuning the controller's parameters to obtain a desired response is difficult and time-consuming. With the development of computer technology and artificial intelligence in the automatic control field, many kinds of parameter tuning methods for PID controllers have emerged, bringing much energy to the study of PID controllers, but many advanced tuning methods do not perform as well as expected. The Honey Bee algorithm (HBA) and the genetic algorithm (GA) are extensively used for real-parameter optimization in diverse fields of study. This paper describes an application of HBA and GA to the problem of designing a PID controller whose parameters comprise the proportional, integral and derivative constants. The presence of three parameters to optimize makes the task of designing a PID controller more challenging than the design of conventional P, PI, and PD controllers. The suitability of the proposed approach has been demonstrated through computer simulation using MATLAB/SIMULINK.
Keywords: controller, GA, optimization, PID, PSO
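A rough software sketch of GA-based PID tuning is given below. It is illustrative only (the paper uses MATLAB/SIMULINK); the plant is approximated as a first-order system and all constants are assumed.

```python
# A minimal sketch: a genetic algorithm searches (Kp, Ki, Kd) to minimize the
# integral of absolute error of the closed-loop unit-step response.
import numpy as np

rng = np.random.default_rng(1)
DT, T_END, TAU, GAIN = 0.001, 2.0, 0.1, 1.0       # assumed plant constants

def step_response_iae(gains):
    """Simulate the PID loop with Euler steps and return the IAE cost."""
    kp, ki, kd = gains
    y = integ = prev_err = iae = 0.0
    for _ in range(int(T_END / DT)):
        err = 1.0 - y                              # unit-step setpoint
        integ += err * DT
        deriv = (err - prev_err) / DT
        u = kp * err + ki * integ + kd * deriv     # PID control law
        y += DT * (-y + GAIN * u) / TAU            # first-order plant
        if not np.isfinite(y):                     # penalize unstable candidates
            return 1e9
        prev_err = err
        iae += abs(err) * DT
    return iae

def ga_tune(pop_size=30, generations=40, bounds=(0.0, 20.0)):
    pop = rng.uniform(*bounds, size=(pop_size, 3))
    for _ in range(generations):
        fitness = np.array([step_response_iae(ind) for ind in pop])
        parents = pop[np.argsort(fitness)[:pop_size // 2]]        # truncation selection
        children = []
        while len(children) < pop_size - len(parents):
            a, b = parents[rng.integers(len(parents), size=2)]
            child = np.where(rng.random(3) < 0.5, a, b)            # uniform crossover
            child += rng.normal(0.0, 0.5, 3)                       # Gaussian mutation
            children.append(np.clip(child, *bounds))
        pop = np.vstack([parents, children])
    return pop[np.argmin([step_response_iae(ind) for ind in pop])]

print("Kp, Ki, Kd =", ga_tune())
```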
Procedia PDF Downloads 544
3360 TechWhiz: Empowering Deaf Students through Inclusive Education
Authors: Paula Escudeiro, Nuno Escudeiro, Márcia Campos, Francisca Escudeiro
Abstract:
In today's world, technical and scientific knowledge plays a vital role in education, research, and employment. Deaf students face unique challenges in educational settings, particularly when it comes to understanding technical and scientific terminology. The reliance on written and spoken languages can create barriers for deaf individuals who primarily communicate using sign language. This lack of accessibility can hinder their learning experience and compromise equity in education. To address this issue, the TechWhiz project has been developed as a comprehensive glossary of scientific and technical concepts explained in sign language. By providing deaf students with access to education in their first language, TechWhiz aims to enhance their learning achievements and promote inclusivity while also fostering equity in education for all students.
Keywords: deaf students, technical and scientific knowledge, automatic sign language, inclusive education
Procedia PDF Downloads 68
3359 Application of Medium High Hydrostatic Pressure in Preserving Textural Quality and Safety of Pineapple Compote
Authors: Nazim Uddin, Yohiko Nakaura, Kazutaka Yamamoto
Abstract:
Compote (fruit in syrup) of pineapple (Ananas comosus L. Merrill) is expected to have a high market potential as a convenient ready-to-eat (RTE) food worldwide. High hydrostatic pressure (HHP) in combination with low temperature (LT) was applied to the processing of pineapple compote, as well as medium HHP (MHHP) in combination with medium-high temperature (MHT), since both processes can enhance liquid impregnation and inactivate microbes. The MHHP+MHT (55 or 65 °C) process, as well as the HHP+LT process, successfully inactivated the microbes in the compote to a non-detectable level. Although the compotes processed by MHHP+MHT or HHP+LT lost their fresh texture in a similar manner as those processed solely by heat, it was indicated that the texture degradation by heat was suppressed under MHHP. The degassing process reduced the hardness, while calcium (Ca) contributed to retaining hardness in the MHT and MHHP+MHT processes. Electrical impedance measurements supported the damage due to degassing and heat. The color, Brix, and appearance were not significantly affected by the processing methods. The MHHP+MHT and HHP+LT processes may be applicable to produce high-quality, safe RTE pineapple compotes. Further studies on the optimization of packaging and storage conditions will be indispensable for commercialization.
Keywords: compote of pineapple, RTE, medium high hydrostatic pressure, postharvest loss, texture
Procedia PDF Downloads 137
3358 Computer Aide Discrimination of Benign and Malignant Thyroid Nodules by Ultrasound Imaging
Authors: Akbar Gharbali, Ali Abbasian Ardekani, Afshin Mohammadi
Abstract:
Introduction: Thyroid nodules have an incidence of 33-68% in the general population. More than 5-15% of these nodules are malignant. Early detection and treatment of thyroid nodules increase the cure rate and provide optimal treatment. Among medical imaging methods, ultrasound is the imaging technique of choice for the assessment of thyroid nodules. Confirming the diagnosis usually demands repeated fine-needle aspiration biopsy (FNAB), so current management carries morbidity and non-zero mortality. Objective: To explore the diagnostic potential of automatic texture analysis (TA) methods in differentiating benign and malignant thyroid nodules in ultrasound imaging, in order to support reliable diagnosis and monitoring of thyroid nodules in their early stages with no need for biopsy. Material and Methods: The thyroid US image database consists of 70 patients (26 benign and 44 malignant) which were reported by a radiologist and proven by biopsy. Two slices per patient were loaded in Mazda software version 4.6 for automatic texture analysis. Regions of interest (ROIs) were defined within the abnormal part of the thyroid nodule ultrasound images. Gray levels within an ROI were normalized according to three normalization schemes: N1, default or original gray levels; N2, +/- 3 sigma, or dynamic intensity limited to µ +/- 3σ; and N3, intensity limited to the 1%-99% range. Up to 270 multiscale texture feature parameters per ROI per normalization scheme were computed from well-known statistical methods employed in the Mazda software. From the statistical point of view, not all calculated texture feature parameters are useful for texture analysis. So, the features were reduced to the 10 best and most effective features per normalization scheme, based on the maximum Fisher coefficient and the minimum probability of classification error and average correlation coefficients (POE+ACC). We analyzed these features under two standardization states (standard (S) and non-standard (NS)) with Principal Component Analysis (PCA), Linear Discriminant Analysis (LDA) and Non-Linear Discriminant Analysis (NDA). A 1NN classifier was used to distinguish between benign and malignant tumors. The confusion matrix and Receiver Operating Characteristic (ROC) curve analysis were used for the formulation of more reliable criteria of the performance of the employed texture analysis methods. Results: The results demonstrated the influence of the normalization schemes and reduction methods on the effectiveness of the obtained features as discriminative descriptors and on the classification results. The selected subset of features under 1%-99% normalization, POE+ACC reduction and NDA texture analysis yielded a high discrimination performance, with an area under the ROC curve (Az) of 0.9722 in distinguishing benign from malignant thyroid nodules, corresponding to a sensitivity of 94.45%, a specificity of 100%, and an accuracy of 97.14%. Conclusions: Our results indicate that computer-aided diagnosis is a reliable method and can provide useful information to help radiologists in the detection and classification of benign and malignant thyroid nodules.
Keywords: ultrasound imaging, thyroid nodules, computer aided diagnosis, texture analysis, PCA, LDA, NDA
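The reduction-plus-1NN classification chain can be sketched with scikit-learn as follows. The texture feature extraction itself is done in Mazda and is not reproduced, and NDA has no direct scikit-learn counterpart, so only the two linear reducers are shown.

```python
# A minimal sketch: standardization, linear feature reduction, 1-NN
# classification, and ROC/confusion-matrix evaluation of texture features.
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import cross_val_predict
from sklearn.metrics import roc_auc_score, confusion_matrix

def evaluate(X, y, reducer):
    """X: texture features per ROI; y: 0 = benign, 1 = malignant."""
    pipe = make_pipeline(StandardScaler(), reducer,
                         KNeighborsClassifier(n_neighbors=1))
    pred = cross_val_predict(pipe, X, y, cv=5)
    return roc_auc_score(y, pred), confusion_matrix(y, pred)

# auc_pca, cm_pca = evaluate(X, y, PCA(n_components=10))
# auc_lda, cm_lda = evaluate(X, y, LinearDiscriminantAnalysis())
```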
Procedia PDF Downloads 280
3357 Analysis of Formation Methods of Range Profiles for an X-Band Coastal Surveillance Radar
Authors: Nguyen Van Loi, Le Thanh Son, Tran Trung Kien
Abstract:
The paper deals with the problem of the formation of range profiles (RPs) for an X-band coastal surveillance radar. Two popular methods, the difference operator method and the window-based method, are reviewed and analyzed via two tests with different datasets. The test results show that although the original window-based method achieves better performance than the difference operator method, it has three main drawbacks: the use of 3 or 4 peaks of an RP for creating the windows, the extension of the window size using the power sum of three adjacent cells on the left and right sides of the windows, and the same threshold applied to all types of vessels to finish the RP formation process. These drawbacks lead to inaccurate RPs due to the low signal-to-clutter ratio. Therefore, some suggestions are proposed to improve the original window-based method.
Keywords: range profile, difference operator method, window-based method, automatic target recognition
Procedia PDF Downloads 127
3356 Design and Realization of Double-Delay Line Canceller (DDLC) Using Fpga
Authors: A. E. El-Henawey, A. A. El-Kouny, M. M. Abd –El-Halim
Abstract:
Moving target indication (MTI) is an anti-clutter technique that limits the display of clutter echoes. It uses the received radar information primarily to display moving targets only. The purpose of MTI is to discriminate moving targets from a background of clutter or slowly-moving chaff particles, as shown in this paper. The processing system in these radars is massive and complex, since it is supposed to perform a great amount of processing in a very short time. In most radar applications, the response of a single canceler is not acceptable since it does not have a wide notch in the stop-band. A double-delay canceler is an MTI delay-line canceler employing the two-delay-line configuration to improve the performance by widening the clutter-rejection notches, as compared with single-delay cancelers. This canceler is also called a double canceler, dual-delay canceler, or three-pulse canceler. In this paper, a double delay line canceler is chosen for study due to its simplicity in both concept and implementation. The paper discusses the implementation of a simple digital moving target indicator (DMTI) using an FPGA, which has distinct advantages compared to an application-specific integrated circuit (ASIC) for the purposes of this work. The FPGA provides flexibility and stability, which are important factors in the radar application.
Keywords: FPGA, MTI, double delay line canceler, Doppler Shift
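The behaviour of the double delay-line canceller can be sketched numerically as follows: the cascade of two single cancellers is the FIR filter with weights [1, -2, 1], whose magnitude response 4·sin²(πfT) gives the widened stop-band notch. The PRF value in the sketch is illustrative.

```python
# A minimal sketch of the three-pulse (double delay-line) canceller.
import numpy as np

PRF = 1000.0                       # assumed pulse repetition frequency, Hz
T = 1.0 / PRF

def double_canceller(pulses):
    """Apply y[n] = x[n] - 2*x[n-1] + x[n-2] along the slow-time (pulse) axis."""
    return np.convolve(pulses, [1.0, -2.0, 1.0], mode="valid")

def response(doppler_hz):
    """Magnitude response of the three-pulse canceller at a given Doppler shift."""
    return 4.0 * np.sin(np.pi * doppler_hz * T) ** 2

# Stationary clutter (0 Hz) is nulled; a target at 300 Hz Doppler is not.
print(response(0.0), response(300.0))
```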
Procedia PDF Downloads 644
3355 Correlation Analysis to Quantify Learning Outcomes for Different Teaching Pedagogies
Authors: Kanika Sood, Sijie Shang
Abstract:
A fundamental goal of education is preparing students to become part of the global workforce by making beneficial contributions to society. In this paper, we analyze student performance for multiple courses that involve different teaching pedagogies: a cooperative learning technique and an inquiry-based learning strategy. Student performance includes student engagement, grades, and attendance records. We perform this study in the Computer Science department for online and in-person courses with 450 students. We perform correlation analysis to study the relationship between student scores and other parameters such as gender and mode of learning. We use natural language processing and machine learning to analyze student feedback data and performance data. We assess the learning outcomes of two teaching pedagogies for undergraduate and graduate courses to showcase the impact of pedagogical adoption and learning outcome as determinants of academic achievement. Early findings suggest that when using the specified pedagogies, students become experts on their topics and show enhanced engagement with peers.
Keywords: bag-of-words, cooperative learning, education, inquiry-based learning, in-person learning, natural language processing, online learning, sentiment analysis, teaching pedagogy
Procedia PDF Downloads 77
3354 A Study of Key Technologies for the Realization of Smart Grid and Its Research Situation in Pakistan and Abroad
Authors: Arjmand Khaliq, Pemra Sohaib
Abstract:
In this paper, smart grid technologies, which convert the conventional grid into a smart grid, are discussed. The integration of advanced technologies, including two-way communication, advanced control systems, sensors, smart metering systems and others, provides the opportunity to make the conventional grid an intelligent and automatic system, which is named the smart grid. This paper gives the concept of the smart grid and the functional characteristics of smart grid technology, sums up the research progress in Pakistan and abroad, and points out the significance of developing the smart grid. Based on the analysis of the smart grid, smart grid technologies will result in a reliable and energy-efficient power system in the future. The key technologies of the smart grid are also reviewed and highlighted, and the problems and challenges in the realization of the smart grid are pointed out.
Keywords: energy, power system reliability, power system monitoring and control, sensor, smart grid, two-way communication
Procedia PDF Downloads 396
3353 Generation of Photo-Mosaic Images through Block Matching and Color Adjustment
Authors: Hae-Yeoun Lee
Abstract:
Mosaic refers to a technique that makes an image by gathering lots of small materials in various colours. This paper presents an automatic algorithm that makes a photomosaic image using photos. The algorithm is composed of four steps: partition and feature extraction, block matching, redundancy removal and colour adjustment. The input image is partitioned into small blocks to extract features. Each block is matched to find a similar photo in the database by comparing the similarity based on the Euclidean difference between blocks. The intensity of the block is adjusted to enhance the similarity of the image by replacing the light and darkness values with those of the relevant block. Further, the quality of the image is improved by minimizing the redundancy of tiles in adjacent blocks. Experimental results show that the proposed algorithm is excellent in both quantitative and qualitative analysis.
Keywords: photomosaic, Euclidean distance, block matching, intensity adjustment
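The block matching and colour adjustment steps can be sketched as follows; redundancy removal between neighbouring blocks is omitted, and the block size is illustrative.

```python
# A minimal sketch: each block of the target image is replaced by the database
# tile whose mean colour is closest in Euclidean distance, then shifted toward
# the block's mean intensity.
import numpy as np

def best_tile(block, tile_means):
    """Index of the tile whose mean RGB is closest (Euclidean) to the block's."""
    d = np.linalg.norm(tile_means - block.reshape(-1, 3).mean(axis=0), axis=1)
    return int(np.argmin(d))

def photomosaic(image, tiles, block=16):
    """image: HxWx3 float array in [0, 1]; tiles: list of block x block x 3 arrays."""
    tile_means = np.array([t.reshape(-1, 3).mean(axis=0) for t in tiles])
    h, w = (image.shape[0] // block) * block, (image.shape[1] // block) * block
    out = np.empty((h, w, 3))
    for y in range(0, h, block):
        for x in range(0, w, block):
            patch = image[y:y + block, x:x + block]
            tile = tiles[best_tile(patch, tile_means)]
            shift = patch.mean() - tile.mean()          # intensity adjustment
            out[y:y + block, x:x + block] = np.clip(tile + shift, 0.0, 1.0)
    return out
```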
Procedia PDF Downloads 279
3352 Radar Signal Detection Using Neural Networks in Log-Normal Clutter for Multiple Targets Situations
Authors: Boudemagh Naime
Abstract:
Automatic radar detection requires methods of adapting to variations in the background clutter in order to control the false alarm rate. The problem becomes more complicated in a non-Gaussian environment. In fact, the conventional approach in real-time applications requires complex statistical modeling and many computational operations. To overcome these constraints, we propose another approach based on an artificial neural network (ANN-CMLD-CFAR) using a Back Propagation (BP) training algorithm. The considered environment follows a log-normal distribution in the presence of multiple Rayleigh targets. To evaluate the performance of the considered detector, several situations, such as the scale parameter and the number of interfering targets, have been investigated. The simulation results show that the ANN-CMLD-CFAR processor outperforms the conventional statistical one.
Keywords: radar detection, ANN-CMLD-CFAR, log-normal clutter, statistical modelling
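For comparison, a conventional CMLD-CFAR detector (the statistical baseline mentioned above) can be sketched as follows; the window sizes, censoring depth and scale factor are illustrative assumptions.

```python
# A minimal sketch of a censored mean level detector (CMLD) CFAR: the largest
# reference cells (assumed to hold interfering targets) are discarded before
# estimating the clutter level.
import numpy as np

def cmld_cfar(power, n_ref=16, n_guard=2, n_censor=4, scale=5.0):
    """Return a boolean detection map for a 1-D vector of squared-envelope samples."""
    detections = np.zeros_like(power, dtype=bool)
    half = n_ref // 2
    for i in range(half + n_guard, len(power) - half - n_guard):
        lead = power[i - n_guard - half:i - n_guard]
        lag = power[i + n_guard + 1:i + n_guard + 1 + half]
        ref = np.sort(np.concatenate([lead, lag]))[:-n_censor]   # censor largest cells
        threshold = scale * ref.mean()
        detections[i] = power[i] > threshold
    return detections
```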
Procedia PDF Downloads 364
3351 Temporal Estimation of Hydrodynamic Parameter Variability in Constructed Wetlands
Authors: Mohammad Moezzibadi, Isabelle Charpentier, Adrien Wanko, Robert Mosé
Abstract:
The calibration of hydrodynamic parameters for subsurface constructed wetlands (CWs) is a sensitive process since highly non-linear equations are involved in unsaturated flow modeling. CW systems are engineered systems designed to favour natural treatment processes involving wetland vegetation, soil, and their microbial flora. Their significant efficiency at reducing the ecological impact of urban runoff has recently been proved in the field. Numerical flow modeling in a vertical, variably saturated CW is carried out here by implementing the Richards model by means of a mixed hybrid finite element method (MHFEM), particularly well adapted to the simulation of heterogeneous media, and the van Genuchten-Mualem parametrization. For validation purposes, the MHFEM results were compared to those of HYDRUS (a software package based on a finite element discretization). As the van Genuchten-Mualem soil hydrodynamic parameters depend on water content, their estimation is the subject of considerable experimental and numerical studies. In particular, the sensitivity analysis performed with respect to the van Genuchten-Mualem parameters reveals a predominant influence of the shape parameters α and n and of the saturated conductivity of the filter on the piezometric heads during saturation and desaturation. Modeling issues arise when the soil reaches oven-dry conditions. Particular attention should also be paid to boundary condition modeling (surface ponding or evaporation) to be able to tackle different sequences of rainfall-runoff events. For proper parameter identification, large field datasets would be needed. As these are usually not available, notably due to the randomness of storm events, we propose a simple, robust and low-cost numerical method for the inverse modeling of the soil hydrodynamic properties. Among the available methods, the variational data assimilation technique introduced by Le Dimet and Talagrand is applied. To that end, the variational data assimilation technique is implemented by applying automatic differentiation (AD) to augment computer codes with derivative computations. Note that very little effort is needed to obtain the differentiated code using the on-line Tapenade AD engine. Field data were collected for a three-layered CW located in Strasbourg (Alsace, France) at the water edge of the urban water stream Ostwaldergraben, during several months. Identification experiments are conducted by comparing measured and computed piezometric heads by means of a least-squares objective function. The temporal variability of the hydrodynamic parameters is then assessed and analyzed.
Keywords: automatic differentiation, constructed wetland, inverse method, mixed hybrid FEM, sensitivity analysis
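The van Genuchten-Mualem relations referred to above can be written out as a short sketch; the parameter values in the example are illustrative textbook values, not those identified for the Strasbourg site.

```python
# A minimal sketch of the van Genuchten-Mualem retention and conductivity
# relations: theta(h) and K(h) as functions of the pressure head h and the
# parameters alpha, n, theta_r, theta_s and Ks that the assimilation identifies.
import numpy as np

def van_genuchten(h, alpha, n, theta_r, theta_s, Ks, l=0.5):
    """Return (theta, K) for pressure head h (negative in the unsaturated zone)."""
    m = 1.0 - 1.0 / n
    h = np.asarray(h, dtype=float)
    Se = np.where(h < 0, (1.0 + (alpha * np.abs(h)) ** n) ** (-m), 1.0)
    theta = theta_r + (theta_s - theta_r) * Se
    K = Ks * Se ** l * (1.0 - (1.0 - Se ** (1.0 / m)) ** m) ** 2
    return theta, K

# Example with sand-like parameters (illustrative values only; SI units).
theta, K = van_genuchten(h=-1.0, alpha=14.5, n=2.68,
                         theta_r=0.045, theta_s=0.43, Ks=8.25e-5)
```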
Procedia PDF Downloads 164
3350 Analysis of Translational Ship Oscillations in a Realistic Environment
Authors: Chen Zhang, Bernhard Schwarz-Röhr, Alexander Härting
Abstract:
To acquire accurate ship motions at the center of gravity, a single low-cost inertial sensor is utilized and applied on board to measure ship oscillating motions. As observations, the three-axis accelerations and three-axis rotational rates provided by the sensor are used. The mathematical model for processing the observation data includes determination of the distance vector between the sensor and the center of gravity in the x, y, and z directions. After setting up the transfer matrix from the sensor’s own coordinate system to the ship’s body frame, an extended Kalman filter is applied to deal with the nonlinearities between the ship motion in the body frame and the observation information in the sensor’s frame. As a side effect, the method eliminates sensor noise and other unwanted errors. The results are not only roll and pitch, but also linear motions, in particular heave and surge at the center of gravity. For testing, we resort to measurements recorded on a small vessel in a well-defined sea state. With response amplitude operators computed numerically by commercial software (Seaway), the motion characteristics are estimated. These agree well with the measurements after processing with the suggested method.
Keywords: extended Kalman filter, nonlinear estimation, sea trial, ship motion estimation
Procedia PDF Downloads 523
3349 Image Segmentation Techniques: Review
Authors: Lindani Mbatha, Suvendi Rimer, Mpho Gololo
Abstract:
Image segmentation is the process of dividing an image into several sections, such as the object's background and the foreground. It is a critical technique in both image-processing tasks and computer vision. Most image segmentation algorithms have been developed for gray-scale images, and little research and few algorithms have been developed for color images. Most image segmentation algorithms or techniques vary based on the input data and the application. Nearly all of the techniques are not suitable for noisy environments. Most of the work that has been done uses the Markov Random Field (MRF), which involves considerable computation and is said to be robust to noise. In recent years, image segmentation has been brought in to tackle problems such as easy processing of an image, interpretation of the contents of an image, and easy analysis of an image. This article reviews and summarizes some of the image segmentation techniques and algorithms that have been developed in past years. The techniques include convolutional neural networks (CNN), edge-based techniques, region growing, clustering, and thresholding techniques, and so on. The advantages and disadvantages of medical ultrasound image segmentation techniques are also discussed. The article also addresses the applications and potential future developments in image segmentation. This review article concludes with the fact that no technique is perfectly suitable for the segmentation of all different types of images, but the use of hybrid techniques yields more accurate and efficient results.
Keywords: clustering-based, convolution-network, edge-based, region-growing
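Two of the classical techniques surveyed above, thresholding and clustering, can be sketched with OpenCV as follows; the input file name is a placeholder.

```python
# A minimal sketch of Otsu thresholding and k-means colour clustering.
import cv2
import numpy as np

image = cv2.imread("ultrasound.png")                  # placeholder input image
gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)

# 1) Thresholding: Otsu picks the threshold that best separates the histogram.
_, otsu_mask = cv2.threshold(gray, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)

# 2) Clustering: k-means in colour space groups pixels into k segments.
pixels = image.reshape(-1, 3).astype(np.float32)
criteria = (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 20, 1.0)
_, labels, centers = cv2.kmeans(pixels, 3, None, criteria, 5,
                                cv2.KMEANS_RANDOM_CENTERS)
segmented = centers[labels.flatten()].reshape(image.shape).astype(np.uint8)
```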
Procedia PDF Downloads 97