Search results for: word processing
3046 Investigating Iraqi EFL University Students' Productive Knowledge of Grammatical Collocations in English
Authors: Adnan Z. Mkhelif
Abstract:
Grammatical collocations (GCs) are word combinations containing a preposition or a grammatical structure, such as an infinitive (e.g. smile at, interested in, easy to learn, etc.). Such collocations tend to be difficult for Iraqi EFL university students (IUSs) to master. To help address this problem, it is important to identify the factors causing it. This study aims at investigating the effects of L2 proficiency, frequency of GCs and their transparency on IUSs' productive knowledge of GCs. The study involves 112 undergraduate participants with different proficiency levels, learning English in formal contexts in Iraq. The data collection instruments include (but are not limited to) a productive knowledge test (designed by the researcher using the British National Corpus (BNC)), as well as the grammar part of the Oxford Placement Test (OPT). The study findings have shown that all the above-mentioned factors have significant effects on IUSs' productive knowledge of GCs. In addition to establishing evidence of which factors of L2 learning might be relevant to learning GCs, it is hoped that the findings of the present study will contribute to more effective methods of teaching that can better address and help overcome the problems IUSs encounter in learning GCs. The study is thus hoped to have significant theoretical and pedagogical implications for researchers, syllabus designers as well as teachers of English as a foreign/second language.
Keywords: corpus linguistics, frequency, grammatical collocations, L2 vocabulary learning, productive knowledge, proficiency, transparency
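The frequency factor above comes from corpus counts. As a minimal illustrative sketch (a three-sentence toy corpus standing in for the BNC, and a hypothetical preposition list), word + preposition bigram candidates can be ranked by frequency like this:

```python
from collections import Counter

# Hypothetical closed list of prepositions for this sketch.
PREPOSITIONS = {"at", "in", "on", "to", "for", "with", "of"}

def collocation_frequencies(corpus_sentences):
    """Count word+preposition bigrams (crude GC candidates) in a tokenized corpus."""
    counts = Counter()
    for sentence in corpus_sentences:
        tokens = sentence.lower().split()
        for w1, w2 in zip(tokens, tokens[1:]):
            if w2 in PREPOSITIONS:
                counts[(w1, w2)] += 1
    return counts

corpus = [
    "she smiled at the camera",
    "he is interested in music",
    "they smiled at each other",
]
freqs = collocation_frequencies(corpus)
print(freqs.most_common(2))
```

A real frequency-based item selection would of course run over the lemmatized, part-of-speech-tagged BNC rather than raw bigrams.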
Procedia PDF Downloads 248
3045 Improved Image Retrieval for Efficient Localization in Urban Areas Using Location Uncertainty Data
Authors: Mahdi Salarian, Xi Xu, Rashid Ansari
Abstract:
Accurate localization of mobile devices based on camera-acquired visual media information usually requires a search over a very large GPS-referenced image database. This paper proposes an efficient method for limiting the search space of an image retrieval engine by extracting and leveraging additional media information about Estimated Positional Error (EPE) to address complexity and accuracy issues in the search, especially for compensating GPS location inaccuracy in dense urban areas. The improved performance is achieved by up to a hundred-fold reduction in the search area used by available reference methods, while providing improved accuracy. To test our procedure we created a database by acquiring Google Street View (GSV) images for downtown Chicago. Other available databases are not suitable for our approach due to the lack of EPE for the query images. We tested the procedure using more than 200 query images, along with EPE, acquired mostly in the densest areas of Chicago with different phones and under different conditions, such as low illumination and from under rail tracks. The effectiveness of our approach and the effect of the size and sector angle of the search area are discussed, and experimental results demonstrate how our proposed method can improve performance simply by utilizing data that are already available on mobile systems such as smartphones.
Keywords: localization, retrieval, GPS uncertainty, bag of words
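The core idea of pruning the retrieval search space with EPE can be sketched as follows. The database entries, query fix, and safety margin below are hypothetical stand-ins, not the authors' implementation; only reference images within an EPE-scaled radius of the reported GPS fix are kept for the expensive visual matching step:

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance between two WGS84 points, in metres."""
    r = 6371000.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def candidate_images(database, query_fix, epe_m, margin=1.5):
    """Keep only reference images within the EPE-scaled radius of the GPS fix."""
    lat0, lon0 = query_fix
    radius = epe_m * margin  # margin absorbs EPE underestimation
    return [img for img in database
            if haversine_m(lat0, lon0, img["lat"], img["lon"]) <= radius]

# Toy GPS-referenced database (ids and coordinates are invented).
db = [
    {"id": "gsv_001", "lat": 41.8781, "lon": -87.6298},
    {"id": "gsv_002", "lat": 41.8790, "lon": -87.6300},
    {"id": "gsv_003", "lat": 41.9500, "lon": -87.6500},
]
shortlist = candidate_images(db, query_fix=(41.8782, -87.6299), epe_m=100)
print([img["id"] for img in shortlist])
```

Only the shortlist is then passed to the bag-of-words retrieval engine, which is where the reported hundred-fold search-area reduction comes from.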
Procedia PDF Downloads 283
3044 Tribological Properties of Non-Stick Coatings Used in Bread Baking Process
Authors: Maurice Brogly, Edwige Privas, Rajesh K. Gajendran, Sophie Bistac
Abstract:
Anti-sticky coatings based on perfluoroalkoxy (PFA) are widely used in the food processing industry, especially for bread making. Their tribological properties, such as low friction coefficient, low surface energy and high heat resistance, make them an appropriate choice for anti-sticky coating applications in moulds for the food processing industry. This study is dedicated to evidencing the transfer of contaminants from the coating due to wear and thermal ageing of the mould. The risk of contamination is induced by damage to the coating by the bread crust during the demoulding stage. The study focuses on the wear resistance and potential transfer of perfluorinated polymer from the anti-sticky coating. Friction between the perfluorinated coating and the bread crust is modeled by a tribological pin-on-disc test. The cellular nature of the bread crust is modeled by a polymer foam. FTIR analysis of the polymer foam after friction allows the evaluation of the transfer from the perfluorinated coating to the polymer foam. The influence of thermal ageing on the physical, chemical and wear properties of the coating is also investigated. FTIR spectroscopic results show that the increase of PFA transfer onto the foam counterface is associated with a decrease of the friction coefficient: increasing lubrication by film transfer lowers friction. Moreover, increasing the friction test parameters (load, speed and sliding distance) also increases the film transfer onto the counterface. Thermal ageing increases the hydrophobic character of the PFA coating and thus also decreases the friction coefficient.
Keywords: fluorobased polymer coatings, FTIR spectroscopy, non-stick food moulds, wear and friction
Procedia PDF Downloads 331
3043 Parallelization of Random Accessible Progressive Streaming of Compressed 3D Models over Web
Authors: Aayushi Somani, Siba P. Samal
Abstract:
Three-dimensional (3D) meshes are data structures which store geometric information of an object or scene, generally in the form of vertices and edges. Current laser scanning and other geometric data acquisition technologies produce high-resolution sampling, which leads to high-resolution meshes. While high-resolution meshes give better quality rendering and are hence often used, the processing as well as storage of 3D meshes is currently resource-intensive. At the same time, web applications for data processing have become ubiquitous owing to their accessibility. For 3D meshes, the advancement of 3D web technologies, such as WebGL and WebVR, has enabled high-fidelity rendering of huge meshes. However, there exists a gap in the ability to stream huge meshes to native client and browser applications due to high network latency. There is also an inherent delay in loading WebGL pages due to large and complex models. The focus of our work is to identify the challenges faced when such meshes are streamed into and processed on hand-held devices with their limited resources. One of the solutions conventionally used in the graphics community to alleviate resource limitations is mesh compression. Our approach is a two-step approach for random accessible progressive compression and its parallel implementation. The first step partitions the original mesh into multiple sub-meshes; we then invoke data parallelism on these sub-meshes for compression. Subsequent threaded decompression logic is implemented inside the web browser engine by modifying the WebGL implementation in the open-source Chromium engine. This concept can be used to revolutionize the way e-commerce and virtual reality technology work on consumer electronic devices: objects can be compressed on the server and transmitted over the network, while progressive decompression is performed on the client device before rendering.
The multiple views currently used on e-commerce sites for viewing the same product from different angles can be replaced by a single progressive model for a smoother user experience. The approach can also be used in WebVR for common and widely used activities such as virtual reality shopping, watching movies and playing games. Our experiments and comparison with existing techniques show encouraging results in terms of latency (the compressed size is ~10-15% of the original mesh), processing time (a 20-22% increase over the serial implementation) and the quality of the user experience in the web browser.
Keywords: 3D compression, 3D mesh, 3D web, chromium, client-server architecture, e-commerce, level of details, parallelization, progressive compression, WebGL, WebVR
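The two-step idea — partition the mesh, compress the sub-meshes in data-parallel fashion, then decompress progressively on the client — can be sketched in Python. Here zlib stands in for the actual mesh codec and a thread pool for the worker threads; all names are illustrative, not the Chromium implementation:

```python
import zlib
from concurrent.futures import ThreadPoolExecutor

def partition(vertex_buffer, n_parts):
    """Split a flat serialized vertex buffer into roughly equal sub-mesh chunks."""
    step = max(1, len(vertex_buffer) // n_parts)
    return [vertex_buffer[i:i + step] for i in range(0, len(vertex_buffer), step)]

def compress_submeshes(sub_meshes):
    # Data parallelism: each sub-mesh is compressed independently.
    # zlib releases the GIL during compression, so threads overlap usefully.
    with ThreadPoolExecutor() as pool:
        return list(pool.map(zlib.compress, sub_meshes))

def progressive_decode(compressed_parts):
    """Yield the mesh incrementally, one decompressed sub-mesh at a time,
    so rendering can start before the full model has arrived."""
    for part in compressed_parts:
        yield zlib.decompress(part)

mesh = bytes(range(256)) * 64            # stand-in for serialized vertices/edges
parts = partition(mesh, n_parts=4)
packed = compress_submeshes(parts)
restored = b"".join(progressive_decode(packed))
assert restored == mesh
print(len(mesh), sum(len(p) for p in packed))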
Procedia PDF Downloads 170
3042 Teaching Tolerance in the Language Classroom through a Text
Authors: Natalia Kasatkina
Abstract:
In an era of ever-increasing globalization, one's grasp of diversity and tolerance has never been more indispensable, and it is a vital duty for all those in the field of foreign language teaching to help children cultivate such values. The present study explores the role of DIVERSITY and TOLERANCE in the language classroom and elementary, middle, and high school students' perceptions of these two concepts. It draws on several theoretical domains of language acquisition, cultural awareness, and school psychology. Relying on these frameworks, the major findings are synthesized, and a paradigm of teaching tolerance through language teaching is formulated. Upon analysing how tolerant our children are with 'others' in and outside the classroom, we have concluded that intolerance and aggression towards the 'other' increase with age, and that a feeling of supremacy over migrants and a sense of fear towards them begin to manifest more apparently when students are in high school. In addition, we have found that children in elementary school do not exhibit such prejudiced thoughts and behavior, which leads us to the belief that tolerance as well as intolerance are learned. Therefore, it is within our reach to teach our children to be open-minded and accepting. We have used the novel 'Uncle Tom's Cabin' by Harriet Beecher Stowe as a springboard for lessons which are not only targeted at shedding light on the role of language in the modern world, but also aim to stimulate an awareness of cultural diversity. We equally strive to conduct further cross-cultural research in order to solidify the theory behind this study, and thus devise a language-based curriculum which would encourage tolerance through the examination of various literary texts.
Keywords: literary text, tolerance, EFL classroom, word-association test
Procedia PDF Downloads 292
3041 Electrospun Membrane Doped with Gold Nanorods for Surface-Enhanced Raman Spectroscopy
Authors: Ziwei Wang, Andrea Lucotti, Luigi Brambilla, Matteo Tommasini, Chiara Bertarelli
Abstract:
Surface-enhanced Raman spectroscopy (SERS) is a highly sensitive detection technique that provides abundant information on low-concentration analytes across various research areas. Based on localized surface plasmon resonance, metal nanostructures including gold, silver and copper have been investigated as SERS substrates in recent decades, and increasing attention has been paid to exploring high-performance, homogeneous, reproducible SERS substrates. Here, we show that electrospinning, an inexpensive technique for fabricating large-scale, self-standing and reproducible membranes, can be effectively used for producing SERS substrates. Nanoparticles and nanorods are added to the electrospinning feed solution to collect functionalized polymer fibrous mats. We report stable electrospun membranes as SERS substrates using gold nanorods (AuNRs) and poly(vinyl alcohol) (PVA). In particular, a post-processing crosslinking step using glutaraldehyde in an acetone environment was applied to the electrospun membrane. It allows the membrane to be used in any liquid environment, including water, which is of interest both for sensing contaminants in wastewater and for biosensing. This crosslinked AuNRs/PVA membrane has demonstrated excellent performance as a SERS substrate for a low-concentration (10⁻⁶ M) Rhodamine 6G (Rh6G) aqueous solution. This post-processing route for fabricating SERS substrates is reported here for the first time, and Raman imaging confirms its excellent stability and performance. Finally, SERS tests have been applied to several analytes, and the applicability of the AuNRs/PVA membrane is broadened by removing the detected analyte by rinsing: the crosslinked AuNRs/PVA membrane is therefore re-usable.
Keywords: SERS spectroscopy, electrospinning, crosslinking, composite materials
Procedia PDF Downloads 140
3040 Teaching Practices for Subverting Significant Retentive Learner Errors in Arithmetic
Authors: Michael Lousis
Abstract:
The most conspicuous and significant errors made by learners during three years of testing of their progress in learning Arithmetic, throughout the development of the Kassel Project in England and Greece, were systematically identified. How retentive these errors were over the three years of officially provided school instruction of Arithmetic in these countries has also been shown. The learners' errors in Arithmetic stemmed from a sample comprised of two hundred (200) English students and one hundred and fifty (150) Greek students. The sample was purposefully selected according to the students' participation in each testing session of the three-year project, in both domains simultaneously, Arithmetic and Algebra. Specific teaching practices have been devised and are presented in this study for subverting these learners' errors, which were found to be retentive at the level of the nationally provided mathematical education of each country. The invention and development of these proposed teaching practices were founded on the rationale of theoretical accounts concerning the explanation, prediction and control of the errors, on conceptual metaphor, and on an analysis that tried to identify the cognitive components and skills required by the specific tasks, in terms of psychology and cognitive science as applied to information processing. The aim of implementing these instructional practices is not only the subversion of these errors but the achievement of mathematical competence, defined as constituted of three elements: appropriate representations, appropriate meaning, and appropriately developed schemata.
However, praxis is of paramount importance, because there is no 'real truth' independent of science, and because praxis serves as quality control when it takes the form of a cognitive method.
Keywords: arithmetic, cognitive science, cognitive psychology, information-processing paradigm, Kassel project, level of the nationally provided mathematical education, praxis, remedial mathematical teaching practices, retentiveness of errors
Procedia PDF Downloads 316
3039 Sentiment Analysis of Fake Health News Using Naive Bayes Classification Models
Authors: Danielle Shackley, Yetunde Folajimi
Abstract:
As more people turn to the internet seeking health-related information, there is a growing risk of finding false, inaccurate, or dangerous information. Sentiment analysis is a natural language processing technique that assigns polarity scores to text, ranging from positive through neutral to negative. In this research, we evaluate the weight of a sentiment analysis feature added to fake health news classification models. The dataset consists of existing, reliably labeled health article headlines, supplemented with health information about COVID-19 collected from social media sources. We started with data preprocessing and tested various vectorization methods, such as Count and TF-IDF vectorization. We implemented three Naive Bayes classifier models: Bernoulli, Multinomial, and Complement. To test the weight of the sentiment analysis feature on the dataset, we created benchmark Naive Bayes classification models without sentiment analysis; those same models were then reproduced with the feature added. We evaluated using precision and accuracy scores. The initial Bernoulli model performed with 90% precision and 75.2% accuracy, while the model supplemented with sentiment labels performed with 90.4% precision and stayed constant at 75.2% accuracy. Our results show that the addition of sentiment analysis did not improve model precision by a wide margin, and there was no evidence of improvement in accuracy; however, we obtained a 1.9% improvement in the precision score with the Complement model. Future expansion of this work could include replicating the experiment process and substituting a deep learning neural network model for the Naive Bayes.
Keywords: sentiment analysis, Naive Bayes model, natural language processing, topic analysis, fake health news classification model
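A minimal sketch of the experimental idea — appending a sentiment label as an extra feature before Naive Bayes classification — is shown below. The tiny lexicon, headlines, and labels are invented for illustration, and the hand-rolled classifier stands in for the scikit-learn models the study used:

```python
import math
from collections import Counter, defaultdict

# Hypothetical mini sentiment lexicon (illustration only).
NEG_WORDS = {"dangerous", "deadly", "hoax", "miracle", "kills"}

def tokens(headline, with_sentiment=False):
    toks = headline.lower().split()
    if with_sentiment:
        # The sentiment feature: a polarity pseudo-token appended to the text.
        polarity = "NEG" if any(t in NEG_WORDS for t in toks) else "NEU"
        toks.append("__SENT_" + polarity + "__")
    return toks

class TinyMultinomialNB:
    """Multinomial Naive Bayes with add-one smoothing (stand-in for sklearn)."""
    def fit(self, docs, labels):
        self.priors = Counter(labels)
        self.counts = defaultdict(Counter)
        for doc, y in zip(docs, labels):
            self.counts[y].update(doc)
        self.vocab = {w for c in self.counts.values() for w in c}
        return self

    def predict(self, doc):
        n, v = sum(self.priors.values()), len(self.vocab)
        best, best_lp = None, -math.inf
        for y in self.priors:
            total = sum(self.counts[y].values())
            lp = math.log(self.priors[y] / n)
            for w in doc:
                lp += math.log((self.counts[y][w] + 1) / (total + v))
            if lp > best_lp:
                best, best_lp = y, lp
        return best

# Invented toy training headlines, not the study's dataset.
train = [
    ("miracle cure kills virus overnight", "fake"),
    ("deadly hoax spreads online", "fake"),
    ("new vaccine trial shows promise", "real"),
    ("health officials report steady recovery", "real"),
]
model = TinyMultinomialNB().fit(
    [tokens(h, with_sentiment=True) for h, _ in train],
    [y for _, y in train],
)
print(model.predict(tokens("miracle hoax cure", with_sentiment=True)))
```

Running the same pipeline with `with_sentiment=False` gives the benchmark model, mirroring the with/without comparison the study performed.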
Procedia PDF Downloads 97
3038 Development of the Integrated Quality Management System of Cooked Sausage Products
Authors: Liubov Lutsyshyn, Yaroslava Zhukova
Abstract:
Over the past twenty years, there has been a drastic change in the mode of nutrition in many countries, which has been reflected in the development of new products and production techniques and has also led to the expansion of sales markets for food products. Studies have shown that solving food safety problems is almost impossible without the active and systematic activity of organizations directly involved in the production, storage and sale of food products, as well as without management of end-to-end traceability and exchange of information. The aim of this research is the development of an integrated quality management and safety assurance system based on the principles of HACCP, traceability and the system approach, with the creation of an algorithm for the identification and monitoring of parameters of the technological process of manufacturing cooked sausage products. A methodology for implementing the integrated system based on the principles of HACCP, traceability and the system approach during the manufacture of cooked sausage products, for effective provision of the defined properties of the finished product, has been developed. As a result of the research, an evaluation technique and performance criteria for the implementation and operation of the HACCP-based quality management and safety assurance system have been developed and substantiated. The paper reveals regularities of the influence of applying HACCP principles, traceability and the system approach on quality and safety parameters of the finished product, as well as regularities in the identification of critical control points.
The algorithm of functioning of the integrated quality management and safety assurance system has also been described, and key requirements have been defined for software allowing the prediction of finished product properties, timely correction of the technological process, and traceability of manufacturing flows. Based on the obtained results, a typical scheme of the integrated quality management and safety assurance system based on HACCP principles, with elements of end-to-end traceability and the system approach, has been developed for the manufacture of cooked sausage products. Quantitative criteria for evaluating the performance of the quality management and safety assurance system have been developed, along with a set of guidance documents for the implementation and evaluation of the integrated HACCP-based system in meat processing plants. The research has demonstrated the effectiveness of continuous monitoring of the manufacturing process through control at the identified critical control points, and the optimal number of critical control points for the manufacture of cooked sausage products has been substantiated. The main results of the research were appraised during 2013-2014 at seven enterprises of the meat processing industry and have been implemented at JSC «Kyiv meat processing plant».
Keywords: cooked sausage products, HACCP, quality management, safety assurance
Procedia PDF Downloads 247
3037 Design and Development of Fleet Management System for Multi-Agent Autonomous Surface Vessel
Authors: Zulkifli Zainal Abidin, Ahmad Shahril Mohd Ghani
Abstract:
Agent-based systems technology has been addressed as a new paradigm for conceptualizing, designing, and implementing software systems. Agents are sophisticated systems that act autonomously across open and distributed environments in solving problems. Nevertheless, it is impractical to rely on a single agent to carry out all computing processes when solving complex problems, and an increasing number of applications lately require multiple agents to work together. A multi-agent system (MAS) is a loosely coupled network of agents that interact to solve problems that are beyond the individual capacities or knowledge of each problem solver. However, the network of a MAS still requires a main system to govern or oversee the operation of the agents in order to achieve a unified goal. We have developed a fleet management system (FMS) in order to manage a fleet of agents, plan routes for the agents, perform real-time data processing and analysis, and issue sets of general and specific instructions to the agents. This FMS should be able to perform real-time data processing, communicate with the autonomous surface vehicle (ASV) agents and generate a bathymetric map according to the data received from each ASV unit. The first algorithm developed handles communication with the ASVs via radio, using standard National Marine Electronics Association (NMEA) protocol sentences. The second algorithm takes care of path planning, formation and pattern generation, and is tested using various sample data. Lastly, the bathymetry map generation algorithm makes use of data collected by the agents to create a bathymetric map in real time. The outcome of this research is expected to be applicable to various other multi-agent systems.
Keywords: autonomous surface vehicle, fleet management system, multi agent system, bathymetry
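Communication with the ASVs relies on standard NMEA 0183 sentences. A minimal, illustrative parser for a GGA position fix (checksum verification plus conversion to decimal degrees) might look like this; it is a sketch, not the authors' FMS code:

```python
def nmea_checksum(body):
    """XOR of all characters between '$' and '*', as required by NMEA 0183."""
    cs = 0
    for ch in body:
        cs ^= ord(ch)
    return "%02X" % cs

def parse_gga(sentence):
    """Parse a GGA fix sentence into decimal-degree latitude/longitude."""
    assert sentence.startswith("$")
    body, _, given = sentence[1:].partition("*")
    if nmea_checksum(body) != given.strip().upper():
        raise ValueError("bad NMEA checksum")
    f = body.split(",")
    assert f[0].endswith("GGA")
    # Latitude is ddmm.mmmm, longitude is dddmm.mmmm.
    lat = int(f[2][:2]) + float(f[2][2:]) / 60.0
    if f[3] == "S":
        lat = -lat
    lon = int(f[4][:3]) + float(f[4][3:]) / 60.0
    if f[5] == "W":
        lon = -lon
    return {"lat": lat, "lon": lon, "fix_quality": int(f[6]), "n_sats": int(f[7])}

# Sample fix; the checksum is computed rather than hard-coded.
body = "GPGGA,123519,4807.038,N,01131.000,E,1,08,0.9,545.4,M,46.9,M,,"
sentence = "$" + body + "*" + nmea_checksum(body)
fix = parse_gga(sentence)
print(round(fix["lat"], 4), round(fix["lon"], 4))
```

Each incoming fix from an ASV would feed both the path-planning algorithm and, together with depth soundings, the real-time bathymetric map.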
Procedia PDF Downloads 271
3036 Revolutionizing Healthcare Communication: The Transformative Role of Natural Language Processing and Artificial Intelligence
Authors: Halimat M. Ajose-Adeogun, Zaynab A. Bello
Abstract:
Artificial Intelligence (AI) and Natural Language Processing (NLP) have transformed computer language comprehension, allowing computers to comprehend spoken and written language with human-like cognition. NLP, a multidisciplinary area that combines rule-based linguistics, machine learning, and deep learning, enables computers to analyze and comprehend human language. NLP applications in medicine range from tackling issues in electronic health records (EHRs) and psychiatry to improving diagnostic precision in orthopedic surgery and optimizing clinical procedures with novel technologies like chatbots. The technology shows promise in a variety of medical sectors, including quicker access to medical records, faster decision-making for healthcare personnel, diagnosing dysplasia in Barrett's esophagus, and boosting radiology report quality. However, successful adoption requires training for healthcare workers, fostering a deep understanding of NLP components, and highlighting the significance of validation before actual application. Despite prevailing challenges, continuous multidisciplinary research and collaboration are critical for overcoming restrictions and paving the way for the revolutionary integration of NLP into medical practice. This integration has the potential to improve patient care, research outcomes, and administrative efficiency. The research methodology includes using NLP techniques for sentiment analysis and emotion recognition, such as evaluating text or audio data to determine the sentiment and emotional nuances communicated by users, which is essential for designing a responsive and sympathetic chatbot. Furthermore, the project includes the adoption of a personalized intervention strategy, in which chatbots are designed to personalize responses by merging NLP algorithms with specific user profiles, treatment history, and emotional states.
The synergy between NLP and personalized medicine principles is critical for tailoring chatbot interactions to each user's demands and conditions, hence increasing the efficacy of mental health care. A detailed survey corroborated this synergy, revealing a remarkable 20% increase in patient satisfaction levels and a 30% reduction in workloads for healthcare practitioners. The survey, which focused on health outcomes and was administered to both patients and healthcare professionals, highlights the improved efficiency and favorable influence on the broader healthcare ecosystem.
Keywords: natural language processing, artificial intelligence, healthcare communication, electronic health records, patient care
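The sentiment-driven personalization described above can be illustrated with a deliberately simple lexicon-and-template sketch. The word lists, profile fields, and response templates are invented; a production chatbot would use trained NLP models rather than word matching:

```python
# Hypothetical mini lexicons (illustration only).
POS = {"better", "calm", "hopeful", "relieved", "good"}
NEG = {"anxious", "sad", "hopeless", "worried", "tired"}

def polarity(text):
    """Crude lexicon-based sentiment: positive/negative/neutral."""
    toks = text.lower().split()
    score = sum(t in POS for t in toks) - sum(t in NEG for t in toks)
    return "positive" if score > 0 else "negative" if score < 0 else "neutral"

def respond(text, profile):
    """Pick a response template from detected sentiment plus the user profile."""
    mood = polarity(text)
    name = profile.get("name", "there")
    if mood == "negative":
        return f"I'm sorry you're feeling this way, {name}. Would a breathing exercise help?"
    if mood == "positive":
        return f"That's great to hear, {name}! Let's keep building on it."
    return f"Thanks for sharing, {name}. Can you tell me more?"

reply = respond("I feel anxious and tired today", {"name": "Sam"})
print(reply)
```

The same routing idea extends naturally to emotion categories and treatment history once a real classifier replaces the lexicon.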
Procedia PDF Downloads 76
3035 An Analysis of Learners’ Reports for Measuring Co-Creational Education
Authors: Takatoshi Ishii, Koji Kimita, Keiichi Muramatsu, Yoshiki Shimomura
Abstract:
To increase the quality of learning, teacher and learners need mutual effort for the realization of educational value. For this purpose, we need to manage the co-creational education between teacher and learners. In this research, we try to find features of co-creational education. More precisely, we analyzed learners' reports by natural language processing and extracted features that describe the state of the co-creational education.
Keywords: co-creational education, e-portfolios, ICT integration, latent Dirichlet allocation
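The keywords mention latent Dirichlet allocation. A compact, illustrative collapsed Gibbs sampler for LDA over toy "learner reports" is sketched below; the reports and hyperparameters are invented, and a real analysis would use a dedicated library:

```python
import random
from collections import defaultdict

def lda_gibbs(docs, n_topics, iters=200, alpha=0.1, beta=0.01, seed=0):
    """Tiny collapsed Gibbs sampler for LDA (illustrative, not library-grade)."""
    rng = random.Random(seed)
    V = len({w for d in docs for w in d})
    ndk = [[0] * n_topics for _ in docs]               # doc-topic counts
    nkw = [defaultdict(int) for _ in range(n_topics)]  # topic-word counts
    nk = [0] * n_topics
    z = []  # z[d][i]: current topic of word i in doc d
    for d, doc in enumerate(docs):
        zd = []
        for w in doc:
            k = rng.randrange(n_topics)
            zd.append(k)
            ndk[d][k] += 1; nkw[k][w] += 1; nk[k] += 1
        z.append(zd)
    for _ in range(iters):
        for d, doc in enumerate(docs):
            for i, w in enumerate(doc):
                k = z[d][i]  # remove the word, resample its topic, re-add it
                ndk[d][k] -= 1; nkw[k][w] -= 1; nk[k] -= 1
                weights = [(ndk[d][t] + alpha) * (nkw[t][w] + beta) / (nk[t] + V * beta)
                           for t in range(n_topics)]
                k = rng.choices(range(n_topics), weights=weights)[0]
                z[d][i] = k
                ndk[d][k] += 1; nkw[k][w] += 1; nk[k] += 1
    return ndk, nkw

# Invented toy learner reports (tokenized).
reports = [
    "lecture slides helped understanding".split(),
    "slides and lecture notes helped".split(),
    "group project deadline stress".split(),
    "project group work stress".split(),
]
ndk, nkw = lda_gibbs(reports, n_topics=2)
print(ndk)
```

The per-document topic proportions in `ndk` are the kind of features the study extracts to describe the state of co-creational education.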
Procedia PDF Downloads 622
3034 Exploring the Power of Words: Domesticating the Competence/Competency Concept in Ugandan Organisations
Authors: John C. Munene, Florence Nansubuga
Abstract:
The study set out to examine a number of theories that have directly or indirectly implied that words are potent, but that the potency depends on the context or practice in which they are utilised. The theories include the Freudian theory of cathexis, which directly suggests that ambiguous events become potent when named, as does the word used to name them. We briefly examine psychological differentiation, which submits that ambiguity is often a result of failure to distinguish figure from ground. We investigate prospect theory, which suggests that when people have to make decisions, they have the option to utilise intuition or reasoned judgment. It suggests that, more often than not, the tendency is to utilise intuition, especially when generic heuristics such as representativeness and similarity are available, and that usage of these heuristics may depend on a lack of salience or accessibility of the situation due to ambiguity. We also examine activity theory, which proposes that the meaning of words emerges directly and dialectically from the activities in which they are used. The paper argues that the power of words will depend on any or all of the theories mentioned above. To examine this general proposition, we test the utilization of a generic competence framework in a local setting. The assumption is that generic frameworks are inherently ambiguous and lack the potency normally associated with the competence concept in the management of human resources. A number of case studies provide initial supporting evidence for the general proposition.
Keywords: competence, meaning, operationalisation, power of words
Procedia PDF Downloads 412
3033 An EBSD Investigation of Ti-6Al-4Nb Alloy Processed by Plane Strain Compression Test
Authors: Anna Jastrzebska, K. S. Suresh, T. Kitashima, Y. Yamabe-Mitarai, Z. Pakiela
Abstract:
Near-α titanium alloys are important materials for aerospace applications, especially high-temperature applications such as jet engines. Mechanical properties of Ti alloys strongly depend on their processing route, so it is very important to understand micro-structure changes under different processing. In our previous study, Nb was found to improve the oxidation resistance of Ti alloys. In this study, the micro-structure evolution of Ti-6Al-4Nb (wt %) alloy was investigated after plane strain compression tests at hot working temperatures in the α and β phase regions. High-resolution EBSD was successfully used for precise phase and texture characterization of this alloy. A 1.1 kg Ti-6Al-4Nb ingot was prepared using cold crucible levitation melting. The ingot was subsequently homogenized at 1050 °C for 1 h, followed by cooling in air. Plate-like specimens measuring 10×20×50 mm³ were cut from the ingot by electrical discharge machining (EDM). The plane strain compression test, using an anvil 10×35 mm in size, was performed at three different strain rates, 0.1 s⁻¹, 1 s⁻¹ and 10 s⁻¹, at 700 °C and 1050 °C, to obtain 75% deformation. The micro-structure was investigated by scanning electron microscopy (SEM) equipped with an electron backscatter diffraction (EBSD) detector. The α/β phase ratio and phase morphology, as well as the crystallographic texture, subgrain size, misorientation angles and misorientation gradients corresponding to each phase, were determined over the middle and edge areas of the samples. The deformation mechanism at each working temperature is discussed, and the evolution of texture with strain rate is investigated. The micro-structure obtained by the plane strain compression test was heterogeneous, with a wide range of grain sizes, because deformation and dynamic recrystallization occurred during deformation at temperatures in the α and β phase regions; it was strongly influenced by strain rate.
Keywords: EBSD, plane strain compression test, Ti alloys
Procedia PDF Downloads 380
3032 Quality Analysis of Vegetables Through Image Processing
Authors: Abdul Khalique Baloch, Ali Okatan
Abstract:
The quality analysis of food and vegetables from images is a hot topic nowadays, with researchers improving on previous findings through different techniques and methods. In this research we have reviewed the literature, identified gaps in it, and suggest an improved approach: we design the algorithm and develop software to measure quality from images, where accuracy on images shows better results, and we compare the results with previous work done so far. The application uses an open-source dataset and the Python language with the TensorFlow Lite framework. In this research we focus on sorting food and vegetables from images: the application sorts and grades produce after processing the images, and it can make fewer errors than manual human grading. Digital picture datasets were created, with the collected images arranged by class. The classification accuracy of the system was about 94%. As fruits and vegetables play a main role in day-to-day life, their quality is essential in evaluating agricultural produce, and customers always want to buy good-quality fruits and vegetables. This document is about quality detection of fruits and vegetables using images. Many customers suffer due to unhealthy foods and vegetables from suppliers, and there is no proper quality measurement level followed by hotel managements. We have developed software to measure the quality of fruits and vegetables from images; it will tell you whether your fruits and vegetables are fresh or rotten. Algorithms reviewed in this work include digital image processing, ResNet, VGG16, CNN and transfer learning for grading feature extraction. This application used an open-source dataset of images, the language used was Python, and a framework for the system was designed.
Keywords: deep learning, computer vision, image processing, rotten fruit detection, fruits quality criteria, vegetables quality criteria
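As a contrast to the CNN/transfer-learning models reviewed above, even a crude colour heuristic can separate obviously fresh from obviously rotten produce. The thresholds and synthetic "images" below are invented for illustration only and are not the paper's method:

```python
def rotten_fraction(pixels, thresh=90):
    """Fraction of dark/brown pixels in an RGB image (list of rows of (r, g, b)).
    Dark, low-green patches are a crude proxy for rot or bruising."""
    dark = total = 0
    for row in pixels:
        for r, g, b in row:
            total += 1
            if g < thresh and (r + g + b) / 3 < thresh:
                dark += 1
    return dark / total

def grade(pixels):
    """Map the rot proxy to a coarse quality grade."""
    f = rotten_fraction(pixels)
    return "fresh" if f < 0.05 else "acceptable" if f < 0.20 else "rotten"

# Synthetic 10x10 test "images": a bright red apple and one with a 40% dark patch.
fresh_apple = [[(200, 40, 40)] * 10 for _ in range(10)]
bruised = [[(60, 40, 30)] * 10 for _ in range(4)] + \
          [[(200, 40, 40)] * 10 for _ in range(6)]
print(grade(fresh_apple), grade(bruised))
```

A heuristic like this can serve as a cheap baseline against which the reported ~94% deep-learning accuracy is judged.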
Procedia PDF Downloads 70
3031 Preliminary Study of the Phonological Development in Three and Four Year Old Bulgarian Children
Authors: Tsvetomira Braynova, Miglena Simonska
Abstract:
The article presents the results of research on phonological processes in three- and four-year-old children. For the purpose of the study, an author's test was developed and conducted among 120 children. The study included three areas of research: repetition at the level of words (96 words), repetition at the level of sentences (10 sentences), and generation of the child's own speech from a picture (15 pictures). The test also gives us additional information about the articulation errors of the assessed children. The main purpose of the testing is to analyze all phonological processes that occur at this age in Bulgarian children and to identify which are typical and atypical for this age. The results show that the most common phonological errors that children make are: sound substitution, elision of a sound, metathesis of a sound, elision of a syllable, and elision of consonants clustered in a syllable. All examined children were identified with an articulation disorder of the bilabial lambdacism type. Measuring the correlation between the average length of repeated speech and the average length of generated speech, the analysis shows that the more words a child can repeat in the repeated-speech part, the more words they can be expected to generate in the sentence-generation part. The results of this study show that the word-naming task provides sufficient and representative information to assess the child's phonology.
Keywords: assessment, phonology, articulation, speech-language development
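The repeated-versus-generated speech relationship is a correlation claim. With per-child averages (the numbers below are hypothetical, not the study's data), a Pearson coefficient makes it concrete:

```python
import math

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical per-child averages: words repeated vs. words generated.
repeated = [3.1, 4.0, 4.8, 5.5, 6.2, 6.9]
generated = [2.5, 3.4, 4.1, 5.0, 5.9, 6.6]
r = pearson_r(repeated, generated)
print(round(r, 3))
```

A strongly positive r is what would support the claim that repetition length predicts generation length.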
Procedia PDF Downloads 1863030 Deproteinization of Moroccan Sardine (Sardina pilchardus) Scales: A Pilot-Scale Study
Authors: F. Bellali, M. Kharroubi, Y. Rady, N. Bourhim
Abstract:
In Morocco, the fish processing industry is an important source of income and generates a large amount of by-products, including skins, bones, heads, guts, and scales. These underutilized resources, particularly scales, contain large amounts of protein and calcium. Sardina pilchardus scales resulting from the transformation operations have the potential to be used as raw material for collagen production. Taking into account this strong expectation of the regional fish industry, upgrading sardine scales is well justified. In addition, political and societal demands for sustainability and environment-friendly industrial production systems, coupled with the depletion of fish resources, drive this trend forward. Fish scales used as a source of collagen therefore have a wide range of applications in the food, cosmetic, and biomedical industries. The main aim of this study is to isolate and characterize acid-solubilized collagen from the scales of the sardine, Sardina pilchardus. Experimental design methodology was adopted to optimize the collagen extraction process. The first stage of this work investigates the optimal conditions for sardine scale deproteinization using response surface methodology (RSM). The second part focuses on demineralization with an HCl solution or EDTA, and the last part establishes the optimal conditions for isolating collagen from fish scales by solvent extraction. The advancement from lab scale to pilot scale is a critical stage in technological development. In this study, the optimal deproteinization conditions validated at laboratory scale were employed in the pilot-scale procedure. The deproteinization of fish scales was then demonstrated at pilot scale (2 kg of scales, 20 l of NaOH), resulting in a protein content of 0.2 mg/ml and a hydroxyproline content of 2.11 mg/l. These results indicate that the pilot scale showed performance similar to that of the lab scale.Keywords: deproteinization, pilot scale, scale, Sardina pilchardus
Procedia PDF Downloads 4463029 Data Mining Spatial: Unsupervised Classification of Geographic Data
Authors: Chahrazed Zouaoui
Abstract:
In recent years, the volume of geospatial information has been increasing due to the evolution of communication and information technologies; this information is often presented through geographic information systems (GIS) and stored in spatial databases (SDB). Classical data mining has shown a weakness in knowledge extraction from these enormous amounts of data due to the particularity of spatial entities, which are characterized by interdependence between them (the first law of geography). This gave rise to spatial data mining. Spatial data mining is a process of analyzing geographic data that allows the extraction of knowledge and spatial relationships from geospatial data; among the methods of this process, we distinguish the monothematic and the multithematic. Geo-clustering is one of the main tasks of spatial data mining and belongs to the monothematic methods. It groups similar geospatial entities into the same class and assigns more dissimilar entities to different classes; in other words, it maximizes intra-class similarity and minimizes inter-class similarity, taking into account the particularity of geospatial data. Two approaches to geo-clustering exist: dynamic processing of the data, which involves applying algorithms designed for the direct treatment of spatial data, and an approach based on pre-processing the spatial data, which consists of applying classic clustering algorithms to pre-processed data (by integration of the spatial relationships). This approach (based on pre-treatment) is quite complex in various cases, so the search for approximate solutions involves the use of approximation algorithms; among these algorithms, we are interested in dedicated approaches (partitioning-based and density-based clustering methods) and the bees approach (a biomimetic approach). Our study proposes a design for this problem, using different algorithms for automatically detecting geospatial neighborhoods in order to implement geo-clustering by pre-treatment, and applying the bees algorithm to this problem for the first time in the geospatial field.Keywords: mining, GIS, geo-clustering, neighborhood
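The density-based branch of the dedicated approaches mentioned above can be illustrated with a DBSCAN-style sketch over 2-D coordinates. This is a generic minimal implementation, not the study's algorithm; `eps`, `min_pts`, and the sample points are illustrative.

```python
import math

def geo_cluster(points, eps, min_pts):
    """Density-based clustering (DBSCAN-style) over 2-D coordinates.
    Returns one cluster label per point; -1 marks noise."""
    labels = [None] * len(points)
    cluster = -1

    def neighbors(i):
        return [j for j in range(len(points))
                if math.dist(points[i], points[j]) <= eps]

    for i in range(len(points)):
        if labels[i] is not None:
            continue
        seeds = neighbors(i)
        if len(seeds) < min_pts:
            labels[i] = -1                 # provisionally noise
            continue
        cluster += 1
        labels[i] = cluster
        queue = [j for j in seeds if j != i]
        while queue:
            j = queue.pop()
            if labels[j] == -1:
                labels[j] = cluster        # noise point becomes a border point
                continue
            if labels[j] is not None:
                continue
            labels[j] = cluster
            nb = neighbors(j)
            if len(nb) >= min_pts:         # core point: keep expanding
                queue.extend(k for k in nb if labels[k] is None)
    return labels

pts = [(0, 0), (0, 1), (1, 0), (1, 1),     # dense group A
       (10, 10), (10, 11), (11, 10),       # dense group B
       (5, 5)]                             # isolated point
labels = geo_cluster(pts, eps=1.5, min_pts=3)
print(labels)  # [0, 0, 0, 0, 1, 1, 1, -1]
```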
Procedia PDF Downloads 3753028 Detecting Indigenous Languages: A System for Maya Text Profiling and Machine Learning Classification Techniques
Authors: Alejandro Molina-Villegas, Silvia Fernández-Sabido, Eduardo Mendoza-Vargas, Fátima Miranda-Pestaña
Abstract:
The automatic detection of indigenous languages in digital texts is essential to promote their inclusion in digital media. Underrepresented languages, such as Maya, are often excluded from language detection tools like Google’s language-detection library, LANGDETECT. This study addresses these limitations by developing a hybrid language detection solution that accurately distinguishes Maya (YUA) from Spanish (ES). Two strategies are employed: the first focuses on creating a profile for the Maya language within the LANGDETECT library, while the second involves training a Naive Bayes classification model with two categories, YUA and ES. The process includes comprehensive data preprocessing steps, such as cleaning, normalization, tokenization, and n-gram counting, applied to text samples collected from various sources, including articles from La Jornada Maya, a major newspaper in Mexico and the only media outlet that includes a Maya section. After the training phase, a portion of the data is used to create the YUA profile within LANGDETECT, which achieves an accuracy rate above 95% in identifying the Maya language during testing. Additionally, the Naive Bayes classifier, trained and tested on the same database, achieves an accuracy close to 98% in distinguishing between Maya and Spanish, with further validation through F1 score, recall, and logarithmic scoring, without signs of overfitting. This strategy, which combines the LANGDETECT profile with a Naive Bayes model, highlights an adaptable framework that can be extended to other underrepresented languages in future research. This fills a gap in Natural Language Processing and supports the preservation and revitalization of these languages.Keywords: indigenous languages, language detection, Maya language, Naive Bayes classifier, natural language processing, low-resource languages
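The Naive Bayes classification over character n-grams described above can be sketched compactly. This is not the study's trained model: the sample phrases below are illustrative stand-ins for the La Jornada Maya corpus, and the trigram/smoothing choices are assumptions.

```python
import math
from collections import Counter

def trigrams(text):
    """Padded character trigrams, a common language-ID feature."""
    padded = f"  {text.lower()} "
    return [padded[i:i + 3] for i in range(len(padded) - 2)]

class NaiveBayesLangID:
    """Character-trigram Naive Bayes with add-one smoothing."""
    def __init__(self):
        self.counts = {}          # label -> Counter of trigrams
        self.vocab = set()

    def fit(self, samples):
        for text, label in samples:
            c = self.counts.setdefault(label, Counter())
            for g in trigrams(text):
                c[g] += 1
                self.vocab.add(g)

    def predict(self, text):
        best, best_lp = None, float("-inf")
        for label, c in self.counts.items():
            total = sum(c.values())
            lp = sum(math.log((c[g] + 1) / (total + len(self.vocab)))
                     for g in trigrams(text))
            if lp > best_lp:
                best, best_lp = label, lp
        return best

clf = NaiveBayesLangID()
clf.fit([
    ("buenos días a todos los presentes", "ES"),
    ("el idioma se habla en la península", "ES"),
    ("bix a beel in wéet láak'", "YUA"),
    ("ma'alob k'iin ti' tuláakal", "YUA"),
])
print(clf.predict("buenos días"))    # ES
print(clf.predict("ma'alob k'iin"))  # YUA
```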
Procedia PDF Downloads 163027 Additive Manufacturing – Application to Next Generation Structured Packing (SpiroPak)
Authors: Biao Sun, Tejas Bhatelia, Vishnu Pareek, Ranjeet Utikar, Moses Tadé
Abstract:
Additive manufacturing (AM), commonly known as 3D printing, with the continuing advances in parallel processing and computational modeling, has created a paradigm shift (with significant radical thinking) in the design and operation of chemical processing plants, especially LNG plants. With the rising energy demands, environmental pressures, and economic challenges, there is a continuing industrial need for disruptive technologies such as AM, which possess capabilities that can drastically reduce the cost of manufacturing and operations of chemical processing plants in the future. However, the continuing challenge for 3D printing is its lack of adaptability in re-designing the process plant equipment coupled with the non-existent theory or models that could assist in selecting the optimal candidates out of the countless potential fabrications that are possible using AM. One of the most common packings used in the LNG process is structured packing in the packed column (which is a unit operation) in the process. In this work, we present an example of an optimum strategy for the application of AM to this important unit operation. Packed columns use a packing material through which the gas phase passes and comes into contact with the liquid phase flowing over the packing, typically performing the necessary mass transfer to enrich the products, etc. Structured packing consists of stacks of corrugated sheets, typically inclined between 40-70° from the plane. Computational Fluid Dynamics (CFD) was used to test and model various geometries to study the governing hydrodynamic characteristics. The results demonstrate that the costly iterative experimental process can be minimized. Furthermore, they also improve the understanding of the fundamental physics of the system at the multiscale level. SpiroPak, patented by Curtin University, represents an innovative structured packing solution currently at a technology readiness level (TRL) of 5~6. 
This packing exhibits remarkable characteristics, offering a substantial increase in surface area while significantly enhancing hydrodynamic and mass transfer performance. Recent studies have revealed that SpiroPak can reduce pressure drop by 50~70% compared to commonly used commercial packings, and it can achieve 20~50% greater mass transfer efficiency (particularly in CO2 absorption applications). The implementation of SpiroPak has the potential to reduce the overall size of columns and decrease power consumption, resulting in cost savings for both capital expenditure (CAPEX) and operational expenditure (OPEX) when applied to retrofitting existing systems or incorporated into new processes. Furthermore, pilot- to large-scale tests are currently underway to further advance and refine this technology.Keywords: Additive Manufacturing (AM), 3D printing, Computational Fluid Dynamics (CFD), structured packing (SpiroPak)
Procedia PDF Downloads 873026 Adapting Tools for Text Monitoring and for Scenario Analysis Related to the Field of Social Disasters
Authors: Svetlana Cojocaru, Mircea Petic, Inga Titchiev
Abstract:
Humanity is confronted more and more often with different social disasters, which in turn can generate new accidents and catastrophes. To mitigate their consequences, it is important to obtain early signals about events that are occurring or may occur and to prepare the corresponding scenarios that could be applied. Our research is focused on solving two problems in this domain: identifying signals that an accident has occurred or may occur, and mitigating some consequences of disasters. To solve the first problem, methods of selecting and processing texts from the Internet were developed; information in Romanian is of special interest to us. In order to obtain the mentioned tools, we followed several steps, divided into a preparatory stage and a processing stage. During the first stage, we manually collected over 724 news articles and classified them into 10 categories of social disasters, amounting to more than 150 thousand words. Using this information, a controlled vocabulary of more than 300 keywords was elaborated, which will help in the process of classification and identification of texts related to the field of social disasters. To solve the second problem, the Petri net formalism was used. We deal with the problem of evacuating inhabitants in useful time. Analysis methods such as the reachability or coverability tree and the invariants technique are used to determine dynamic properties of the modeled systems. To perform a case study of the properties of the evacuation system extended with time, the analysis modules of PIPE, such as Generalized Stochastic Petri Net (GSPN) Analysis, Simulation, State Space Analysis, and Invariant Analysis, were used. These modules helped us to obtain the average number of persons situated in the rooms and other quantitative properties and characteristics related to the system's dynamics.Keywords: lexicon of disasters, modelling, Petri nets, text annotation, social disasters
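The Petri-net view of evacuation can be made concrete with a minimal token-firing sketch. This is an illustration of the formalism, not the PIPE tool or the authors' model; the place and transition names are hypothetical.

```python
# Minimal Petri net: a marking maps places to token counts; a transition
# fires by consuming its pre-set tokens and producing its post-set tokens.

def enabled(marking, pre):
    return all(marking[p] >= n for p, n in pre.items())

def fire(marking, pre, post):
    m = dict(marking)
    for p, n in pre.items():
        m[p] -= n
    for p, n in post.items():
        m[p] = m.get(p, 0) + n
    return m

# Two rooms and an exit: "move" takes a person from room1 to room2,
# "leave" takes a person from room2 outside (tokens model persons).
transitions = {
    "move":  ({"room1": 1}, {"room2": 1}),
    "leave": ({"room2": 1}, {"outside": 1}),
}
m = {"room1": 2, "room2": 0, "outside": 0}
m = fire(m, *transitions["move"])    # one person moves to room2
m = fire(m, *transitions["leave"])   # that person exits
print(m)  # {'room1': 1, 'room2': 0, 'outside': 1}
```

A reachability-tree analysis like the one described above enumerates all markings reachable by repeatedly firing enabled transitions from the initial marking.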
Procedia PDF Downloads 1973025 Harnessing the Benefits and Mitigating the Challenges of Neurosensitivity for Learners: A Mixed Methods Study
Authors: Kaaryn Cater
Abstract:
People vary in how they perceive, process, and react to internal, external, social, and emotional environmental factors; some are more sensitive than others. Highly sensitive people have a highly reactive nervous system and are more impacted by positive and negative environmental conditions (Differential Susceptibility). Further, some sensitive individuals are disproportionately able to benefit from positive and supportive environments without necessarily suffering negative impacts in less supportive environments (Vantage Sensitivity). Environmental sensitivity is underpinned by physiological, genetic, and personality/temperamental factors, and the phenotypic expression of high sensitivity is Sensory Processing Sensitivity. The hallmarks of Sensory Processing Sensitivity are deep cognitive processing, emotional reactivity, high levels of empathy, noticing environmental subtleties, a tendency to observe new and novel situations, and a propensity to become overwhelmed when over-stimulated. Several educational advantages associated with high sensitivity include creativity, enhanced memory, divergent thinking, giftedness, and metacognitive monitoring. High sensitivity can also lead to some educational challenges, particularly managing multiple conflicting demands and negotiating low sensory thresholds. A mixed-methods study was undertaken. In the first, quantitative study, participants completed the Perceived Success in Study Survey (PSISS) and the Highly Sensitive Person Scale (HSPS-12). The inclusion criterion was current or previous post-secondary education experience. The survey was distributed on social media, and snowball recruitment was employed (n=365). The Excel spreadsheets were uploaded to the Statistical Package for the Social Sciences (SPSS) 26, and descriptive statistics found a normal distribution.
T-tests and analysis of variance (ANOVA) calculations found no difference in the responses of demographic groups, and Principal Components Analysis and post-hoc Tukey calculations identified positive associations between high sensitivity and three of the five PSISS factors. Further ANOVA calculations found positive associations between the PSISS and two of the three sensitivity subscales. This study included a response field to register interest in further research. Respondents who scored in the 70th percentile on the HSPS-12 were invited to participate in a semi-structured interview. Thirteen interviews were conducted remotely (12 female). Reflexive inductive thematic analysis was employed to analyse the data, and a descriptive approach was employed to present data reflective of participant experience. The results of this study found that highly sensitive students prioritize work-life balance; employ a range of practical metacognitive study and self-care strategies; value independent learning; connect with learning that is meaningful; and are bothered by aspects of the physical learning environment, including lighting, noise, and indoor environmental pollutants. There is a dearth of research investigating sensitivity in the educational context, and these studies highlight the need to promote widespread education-sector awareness of environmental sensitivity and the need to include sensitivity in sector and institutional diversity and inclusion initiatives.Keywords: differential susceptibility, highly sensitive person, learning, neurosensitivity, sensory processing sensitivity, vantage sensitivity
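The one-way ANOVA used in the quantitative analysis reduces to comparing between-group and within-group variance. A minimal sketch on illustrative data (not the study's survey responses):

```python
def one_way_anova_f(groups):
    """F statistic: between-group mean square over within-group mean square."""
    k = len(groups)                                  # number of groups
    n = sum(len(g) for g in groups)                  # total observations
    grand_mean = sum(sum(g) for g in groups) / n
    ss_between = sum(len(g) * (sum(g) / len(g) - grand_mean) ** 2
                     for g in groups)
    ss_within = sum(sum((x - sum(g) / len(g)) ** 2 for x in g)
                    for g in groups)
    return (ss_between / (k - 1)) / (ss_within / (n - k))

# Two illustrative groups with clearly different means:
print(one_way_anova_f([[1, 2, 3], [4, 5, 6]]))  # 13.5
```

A large F relative to the F-distribution's critical value indicates that group means differ more than within-group noise would explain.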
Procedia PDF Downloads 653024 Gnss Aided Photogrammetry for Digital Mapping
Authors: Muhammad Usman Akram
Abstract:
This research work is based on GNSS-aided photogrammetry for digital mapping. It focuses on the topographic survey of an area or site to be used in future planning and development (P&D) or for further examination, exploration, research, and inspection. Surveying and mapping hard-to-access and hazardous areas are very difficult with traditional techniques and methodologies; they are also time-consuming and labor-intensive and offer less precision with limited data. In comparison, advanced techniques save manpower and provide more precise output with a wide variety of data sets. In this experiment, the aerial photogrammetry technique is used, in which a UAV flies over an area, captures geocoded images, and produces a three-dimensional model (3-D model). The UAV operates on a user-specified path or area with various parameters: flight altitude, ground sampling distance (GSD), image overlap, camera angle, etc. For ground control, a network of points on the ground is observed as ground control points (GCPs) using the Differential Global Positioning System (DGPS) in PPK or RTK mode. The raw data collected by the UAV and DGPS are then processed in digital image processing programs and computer-aided design software, from which we obtain, as output, a dense point cloud, a digital elevation model (DEM), and an orthophoto. The imagery is converted into geospatial data by digitizing over the orthophoto, and the DEM is further converted into a digital terrain model (DTM) for contour generation or a digital surface. As a result, we obtain a digital map of the surveyed area. In conclusion, we compared the processed data with exact measurements taken on site; errors are accepted if they do not breach the survey accuracy limits set by the concerned institutions.Keywords: photogrammetry, post processing kinematics, real time kinematics, manual data inquiry
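The ground sampling distance (GSD) mentioned among the flight parameters follows directly from camera geometry. A small helper, with camera values that are illustrative assumptions (roughly matching a common 1-inch-sensor survey drone), not the equipment used in the study:

```python
def ground_sampling_distance(sensor_width_mm, focal_length_mm,
                             altitude_m, image_width_px):
    """GSD in cm/pixel: ground distance covered by a single image pixel."""
    return (sensor_width_mm * altitude_m * 100) / (focal_length_mm * image_width_px)

# 13.2 mm sensor, 8.8 mm lens, 5472 px wide images, flown at 100 m:
gsd = ground_sampling_distance(13.2, 8.8, 100, 5472)
print(round(gsd, 2))  # 2.74 cm/pixel
```

Halving the flight altitude halves the GSD, i.e., doubles the ground resolution, which is why altitude is the primary lever when planning a mission for a target map scale.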
Procedia PDF Downloads 323023 Marketing Practices of the Urban and Recycled Wood Industry in the United States
Authors: Robert Smith, Omar Espinoza, Anna Pitta
Abstract:
In the United States, trees felled in urban areas and wood generated through construction and demolition are primarily disposed of as low-value resources, such as biomass for energy, landscaping mulch, composting, or landfilled. An emerging industry makes use of these underutilized resources to produce high value-added products, with associated benefits for the environment, the local economy, and consumers. For the circular economy to be successful, markets must be created for sustainable, reusable natural materials. Research was carried out to increase the understanding of the marketing practices of urban and reclaimed wood industries. This paper presents the results of a nationwide survey of these companies. The results indicate that a majority of companies in this industry are small firms, operating for less than 10 years, which produce mostly to order and sell their products at comparatively higher prices than competing products made from virgin natural resources. Promotional messages included quality, aesthetics, and customization, conveyed through company webpages, word of mouth, and social media. Distribution channels used include direct sales, online sales, and retail sales. Partnerships are critical for effective raw material procurement. Respondents indicated optimistic growth expectations, despite barriers associated with urban and reclaimed wood materials and production.Keywords: urban and reclaimed wood, circular economy, marketing, wood products
Procedia PDF Downloads 1253022 Controlling Drone Flight Missions through Natural Language Processors Using Artificial Intelligence
Authors: Sylvester Akpah, Selasi Vondee
Abstract:
Drones, also known as Unmanned Aerial Vehicles (UAVs), have attracted increasing attention in recent years due to their ubiquitous nature and boundless applications in the areas of communication, surveying, aerial photography, weather forecasting, medical delivery, and surveillance, amongst others. Operated remotely in real time or pre-programmed, drones can fly autonomously or on pre-defined routes. The application of these aerial vehicles has spread successfully throughout the world due to technological evolution, so many more businesses are utilizing their capabilities. Unfortunately, while drones offer the benefits stated above, they come with some problems, mainly attributed to the complexity of learning to master drone flight, collision avoidance, and enterprise security. Additional challenges, such as the analysis of flight data recorded by sensors attached to the drone, may take time and require expert help to analyse and understand. This paper presents an autonomous drone control system using a chatbot. The system allows for easy control of drones through conversation with the aid of natural language processing, reducing the workload needed to set up, deploy, control, and monitor drone flight missions. The results obtained at the end of the study revealed that the drone connected to the chatbot was able to initiate flight missions with just text and voice commands, hold a conversation, and give real-time feedback on data and requests made to the chatbot. The results further revealed that the system was able to process natural language and produce human-like conversational abilities using artificial intelligence (natural language understanding). It is recommended that radio signal adapters be used instead of wireless connections to increase the range of communication with the aerial vehicle.Keywords: artificial intelligence, chatbot, natural language processing, unmanned aerial vehicle
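The command-parsing step of such a chatbot can be sketched with simple keyword intent matching. This is a hedged illustration only: the real system uses a natural language understanding service, and the intents, phrases, and action names here are hypothetical.

```python
# Map free-text chat input to a drone action via keyword intent matching.
# Intents and trigger phrases below are illustrative, not the paper's.

INTENTS = {
    "takeoff": ("take off", "launch", "start flight"),
    "land":    ("land", "come down", "return"),
    "photo":   ("photo", "picture", "snapshot"),
}

def parse_command(utterance):
    """Return the first intent whose trigger phrase appears in the text."""
    text = utterance.lower()
    for intent, phrases in INTENTS.items():
        if any(p in text for p in phrases):
            return intent
    return "unknown"

print(parse_command("Please take off and start the mission"))  # takeoff
print(parse_command("Take a picture of the field"))            # photo
print(parse_command("What's the weather?"))                    # unknown
```

In a full system, the returned intent would be dispatched to the flight controller API, and anything mapped to "unknown" would fall through to the conversational model for clarification.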
Procedia PDF Downloads 1423021 Processing, Nutritional Assessment and Sensory Evaluation of Bakery Products Prepared from Orange Fleshed Sweet Potatoes (OFSP) and Wheat Composite Flours
Authors: Hategekimana Jean Paul, Irakoze Josiane, Ishimweyizerwe Valentin, Iradukunda Dieudonne, Uwanyirigira Jeannette
Abstract:
Orange-fleshed sweet potatoes (OFSP) are widely grown and plentifully available in rural and urban local markets, and their contribution to reducing food insecurity in Rwanda is considerable. However, post-harvest loss of this commodity is a critical challenge due to its high perishability. Much research has been conducted on how fresh food commodities can be transformed into extended-shelf-life food products to prevent post-harvest losses, but such work had not yet been well studied in Rwanda. The aim of the present study was the processing of baked products from OFSP combined with wheat composite flour, and the assessment of the nutritional content and consumer acceptability of the newly developed products. The perishability of OFSP, and the resulting scarcity during the off-season, can be addressed by producing cake, doughnuts, and bread with OFSP puree or flour. For the doughnuts and bread, OFSP puree was combined with other ingredients into a dough, followed by frying or baking; for the cake, OFSP was dried in a solar dryer to obtain a flour, which was mixed with wheat flour and other ingredients into a cake batter and baked. For each product, one control and three experimental samples were prepared (three products in three different ratios (30, 40, and 50%) of OFSP, with the remaining percentage wheat flour). All samples, including the controls, were analyzed for consumer acceptability (sensory attributes). The most preferred samples (one sample for each product, with its control, for each OFSP variety) were analyzed for nutritional composition along with the control sample. The cake from the Terimbere variety and the bread from Gihingumukungu supplemented with 50% OFSP flour or puree, respectively, were the most acceptable, except for the doughnut from the Vita variety, which was most highly accepted at 50% OFSP supplementation. The moisture, ash, protein, fat, fiber, total carbohydrate, vitamin C, reducing sugar, and mineral (sodium, potassium, and phosphorus) content differed among products. The cake was rich in fiber (14.71%), protein (6.59%), and vitamin C (19.988 mg/100 g) compared to the other samples, while the bread was found to be rich in reducing sugar, with 12.71 mg/100 g, compared to the cake and doughnut. The doughnut was found to be rich in fat, with 6.89%, compared to the other samples. In the sensory analysis, the doughnut was most highly accepted at a ratio of 60:40 compared to the other products, while the cake was least accepted at a ratio of 50:50. The proximate composition and mineral content of all the OFSP products were significantly higher than those of the control samples.Keywords: post-harvest loss, OFSP products, wheat flour, sensory evaluation, proximate composition
Procedia PDF Downloads 623020 Multilingualism and Unification of Teaching
Authors: Mehdi Damaliamiri, Firouzeh Akbari
Abstract:
Teaching literature to children at an early age is of great importance, and there have been different methods to facilitate learning it. By law, all children going to school in Iran must learn the Persian language and literature. This requirement interacts with two different levels of learning related to urban and rural bilingualism. For bilingual children living in villages, learning literature and a new language (Persian) becomes a major challenge, as it relies on the teacher's translation, whereas in cities it is easier because children's exposure to Persian is greater. In recent years, TV and radio programs have been considered as a way to change the trend of Persian learning by children who speak another language, but the scores of students in national Persian-language exams show that these programs have not been so effective for bilingual students living in villages. To identify the determinants of weak Persian learning by bilingual children, two different regions were chosen, a Turkish-speaking and a Kurdish-speaking community, to compare their learning of Persian in the first and second grades of elementary school. The learning criteria were based on the syllabification of Persian words, word order in the sentence, and compound sentences. Students were taught in Persian how to recognize syllabification without being allowed to translate the words into their own languages and were asked to produce simple sentences in Persian in response to situational questions. Teaching methods, language relatedness with Persian, and exposure to social media programs, especially TV and radio, were the factors considered to affect children's potential for learning Persian.Keywords: bilingualism, Persian, education, literature
Procedia PDF Downloads 733019 Signal Processing of the Blood Pressure and Characterization
Authors: Hadj Abd El Kader Benghenia, Fethi Bereksi Reguig
Abstract:
In clinical medicine, blood pressure and hemodynamic monitoring provide rich pathophysiological information about the cardiovascular system, described through factors such as blood volume, arterial compliance, and peripheral resistance. In this work, we are interested in analyzing these signals and propose a detection algorithm to delineate the different sequences, especially systolic blood pressure (SBP), diastolic blood pressure (DBP), and the dicrotic wave, in order to analyze them and extract the cardiovascular parameters.Keywords: blood pressure, SBP, DBP, detection algorithm
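The delineation idea can be illustrated with a toy peak-detection pass over a sampled pressure waveform. This is not the paper's algorithm: real delineators handle baseline drift, dicrotic notches, and noise, and the synthetic samples below are illustrative.

```python
def detect_sbp_dbp(signal):
    """Toy delineation: systolic peaks = local maxima above the signal mean;
    diastolic points = minima between consecutive systolic peaks."""
    mean = sum(signal) / len(signal)
    peaks = [i for i in range(1, len(signal) - 1)
             if signal[i - 1] < signal[i] >= signal[i + 1] and signal[i] > mean]
    troughs = []
    for a, b in zip(peaks, peaks[1:]):
        seg = signal[a:b + 1]
        troughs.append(a + seg.index(min(seg)))
    return peaks, troughs

# Two synthetic beats (arbitrary units): rise to ~120, fall to ~80, repeat.
wave = [80, 95, 110, 120, 110, 95, 82, 80, 96, 112, 121, 109, 94, 83]
peaks, troughs = detect_sbp_dbp(wave)
print([wave[i] for i in peaks])    # systolic values: [120, 121]
print([wave[i] for i in troughs])  # diastolic value between beats: [80]
```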
Procedia PDF Downloads 4393018 Survey of Communication Technologies for IoT Deployments in Developing Regions
Authors: Namugenyi Ephrance Eunice, Julianne Sansa Otim, Marco Zennaro, Stephen D. Wolthusen
Abstract:
The Internet of Things (IoT) is a network of connected data processing devices, mechanical and digital machinery, items, animals, or people that may send data across a network without requiring human-to-human or human-to-computer interaction. Each component has sensors that can pick up on specific phenomena, as well as processing software and other technologies that can link to and communicate with other systems and/or devices over the Internet or other communication networks and exchange data with them. IoT is increasingly being used in fields other than consumer electronics, such as public safety, emergency response, industrial automation, autonomous vehicles, the Internet of Medical Things (IoMT), and general environmental monitoring. Consumer-based IoT applications, like smart home gadgets and wearables, are also becoming more prevalent. This paper presents the main IoT deployment areas for environmental monitoring in developing regions and the backhaul options suitable for them. A detailed review of each of the papers selected for the study is included in Section III of this document. The study includes an overview of existing IoT deployments and the underlying communication architectures, protocols, and technologies that support them. This overview shows that Low-Power Wide-Area Networks (LPWANs), as summarized in Table 1, are very well suited for environmental monitoring architectures designed for remote locations. LoRa technology, particularly the LoRaWAN protocol, has an advantage over other technologies due to its low power consumption, adaptability, and suitable communication range. The prevailing challenges of the different architectures are discussed and summarized in Table 3 of Section IV, where the main problem is the obstruction of communication paths by buildings, trees, hills, etc.Keywords: communication technologies, environmental monitoring, Internet of Things, IoT deployment challenges
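LoRa's power/range trade-off is governed by the spreading factor: each SF step roughly doubles the symbol time, extending range at the cost of airtime and energy. A sketch of the standard SX127x-style time-on-air formula; the defaults below are assumptions and should be checked against the transceiver datasheet for a given deployment.

```python
import math

def lora_airtime(payload_bytes, sf=7, bw=125_000, cr=1,
                 preamble=8, crc=True, implicit_header=False, ldro=False):
    """Approximate LoRa time-on-air in seconds (SX127x-style formula).
    cr=1 means coding rate 4/5; ldro is low-data-rate optimization."""
    t_sym = (2 ** sf) / bw                       # symbol duration
    num = (8 * payload_bytes - 4 * sf + 28
           + 16 * int(crc) - 20 * int(implicit_header))
    den = 4 * (sf - 2 * int(ldro))
    n_payload = 8 + max(math.ceil(num / den) * (cr + 4), 0)
    return (preamble + 4.25 + n_payload) * t_sym

print(round(lora_airtime(10, sf=7) * 1000, 3))   # ~41.216 ms for 10 bytes
print(round(lora_airtime(10, sf=12) * 1000, 1))  # far longer at SF12
```

This is why remote environmental-monitoring nodes at high SF must keep payloads small: duty-cycle limits and battery budgets are both consumed by airtime.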
Procedia PDF Downloads 853017 Cataphora in English and Chinese Conversation: A Corpus-based Contrastive Study
Authors: Jun Gao
Abstract:
This paper combines corpus-based and contrastive approaches, seeking to provide a systematic account of cataphora in English and Chinese natural conversations. Based on spoken corpus data, the first part of the paper examines a range of characteristics of cataphora in the two languages, including frequency of occurrence, patterns, and syntactic features. On the basis of this exploration, cataphora in the two languages are contrasted in a structured way. The analysis shows that English and Chinese share a similar distribution of cataphora in natural conversations in terms of frequency of occurrence, with repeat-identification cataphora more frequent than first-mention cataphora and intra-sentential cataphora much more frequent than inter-sentential cataphora. In terms of patterns, three types are identified in English, i.e., P+N, Ø+N, and it+Clause, while in Chinese, two types are identified, i.e., P+N and Ø+N. English and Chinese are similar in terms of syntactic features: in intra-sentential cataphora, the cataphor and postcedent mainly occur in the initial subject position of the same clause, with the postcedent either immediately following or delayed, while in inter-sentential cataphora, the cataphor and postcedent are mostly in adjacent sentences. In the second part of the paper, the motivations of cataphora are investigated. It is found that cataphora is primarily motivated by the speaker's and hearer's different knowledge states with regard to the referent. Other factors are also involved, such as interference, word search, and the tension between the principles of Economy and Clarity.Keywords: cataphora, contrastive study, motivation, pattern, syntactic features
Procedia PDF Downloads 81