Search results for: phonological processing
2820 Detecting Indigenous Languages: A System for Maya Text Profiling and Machine Learning Classification Techniques
Authors: Alejandro Molina-Villegas, Silvia Fernández-Sabido, Eduardo Mendoza-Vargas, Fátima Miranda-Pestaña
Abstract:
The automatic detection of indigenous languages in digital texts is essential to promote their inclusion in digital media. Underrepresented languages, such as Maya, are often excluded from language detection tools like Google’s language-detection library, LANGDETECT. This study addresses these limitations by developing a hybrid language detection solution that accurately distinguishes Maya (YUA) from Spanish (ES). Two strategies are employed: the first focuses on creating a profile for the Maya language within the LANGDETECT library, while the second involves training a Naive Bayes classification model with two categories, YUA and ES. The process includes comprehensive data preprocessing steps, such as cleaning, normalization, tokenization, and n-gram counting, applied to text samples collected from various sources, including articles from La Jornada Maya, a major newspaper in Mexico and the only media outlet that includes a Maya section. After the training phase, a portion of the data is used to create the YUA profile within LANGDETECT, which achieves an accuracy rate above 95% in identifying the Maya language during testing. Additionally, the Naive Bayes classifier, trained and tested on the same database, achieves an accuracy close to 98% in distinguishing between Maya and Spanish, with further validation through F1 score, recall, and logarithmic scoring, without signs of overfitting. This strategy, which combines the LANGDETECT profile with a Naive Bayes model, highlights an adaptable framework that can be extended to other underrepresented languages in future research. This fills a gap in Natural Language Processing and supports the preservation and revitalization of these languages.
Keywords: indigenous languages, language detection, Maya language, Naive Bayes classifier, natural language processing, low-resource languages
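The profiling-plus-Naive-Bayes pipeline this abstract describes can be sketched, at its core, as a character n-gram Naive Bayes classifier. The sketch below is a minimal illustration on invented, ASCII-simplified toy sentences, not the authors' trained models or the La Jornada Maya corpus:

```python
import math
from collections import Counter

def char_ngrams(text, n=3):
    """Normalize whitespace/case and extract overlapping character n-grams."""
    text = " ".join(text.lower().split())
    return [text[i:i + n] for i in range(len(text) - n + 1)]

def train(samples):
    """Build a per-language n-gram frequency profile from labeled texts."""
    profiles = {}
    for label, text in samples:
        profiles.setdefault(label, Counter()).update(char_ngrams(text))
    return profiles

def classify(text, profiles):
    """Score each language with add-one-smoothed log-likelihoods."""
    best_label, best_score = None, float("-inf")
    for label, counts in profiles.items():
        total, vocab = sum(counts.values()), len(counts)
        score = sum(math.log((counts[g] + 1) / (total + vocab))
                    for g in char_ngrams(text))
        if score > best_score:
            best_label, best_score = label, score
    return best_label

# Invented toy training data standing in for the real corpus.
samples = [
    ("ES", "el nino come en la casa y la senora habla espanol con los vecinos"),
    ("ES", "los estudiantes leen un libro nuevo en la escuela cada manana"),
    ("YUA", "le paalo ku janal ich naj yetel le koolelo ku tan maya"),
    ("YUA", "le xoknaalobo ku xokik jun peel analte ich le najo"),
]
profiles = train(samples)
```

On this toy data, a phrase sharing n-grams with the Maya samples scores higher under the YUA profile; the real system's preprocessing (cleaning, normalization, tokenization) and evaluation are as described in the abstract.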
Procedia PDF Downloads 16
2819 Additive Manufacturing – Application to Next Generation Structured Packing (SpiroPak)
Authors: Biao Sun, Tejas Bhatelia, Vishnu Pareek, Ranjeet Utikar, Moses Tadé
Abstract:
Additive manufacturing (AM), commonly known as 3D printing, together with continuing advances in parallel processing and computational modeling, has created a paradigm shift, with significant radical rethinking, in the design and operation of chemical processing plants, especially LNG plants. With rising energy demands, environmental pressures, and economic challenges, there is a continuing industrial need for disruptive technologies such as AM, which possess capabilities that can drastically reduce the cost of manufacturing and operating chemical processing plants in the future. However, a continuing challenge for 3D printing is its lack of adaptability in re-designing process plant equipment, coupled with the non-existence of theory or models that could assist in selecting the optimal candidates out of the countless potential fabrications that AM makes possible. One of the most common packings used in the LNG process is structured packing in the packed column (a unit operation) in the process. In this work, we present an example of an optimum strategy for the application of AM to this important unit operation. Packed columns use a packing material through which the gas phase passes and comes into contact with the liquid phase flowing over the packing, performing the mass transfer necessary to enrich the products. Structured packing consists of stacks of corrugated sheets, typically inclined at 40-70° from the horizontal plane. Computational Fluid Dynamics (CFD) was used to test and model various geometries to study the governing hydrodynamic characteristics. The results demonstrate that the costly iterative experimental process can be minimized. Furthermore, they also improve the understanding of the fundamental physics of the system at the multiscale level. SpiroPak, patented by Curtin University, represents an innovative structured packing solution currently at a technology readiness level (TRL) of 5~6.
This packing exhibits remarkable characteristics, offering a substantial increase in surface area while significantly enhancing hydrodynamic and mass transfer performance. Recent studies have revealed that SpiroPak can reduce pressure drop by 50~70% compared to commonly used commercial packings, and it can achieve 20~50% greater mass transfer efficiency (particularly in CO2 absorption applications). The implementation of SpiroPak has the potential to reduce the overall size of columns and decrease power consumption, resulting in cost savings in both capital expenditure (CAPEX) and operational expenditure (OPEX) when applied to retrofitting existing systems or incorporated into new processes. Furthermore, pilot- to large-scale tests are currently underway to further advance and refine this technology.
Keywords: additive manufacturing (AM), 3D printing, computational fluid dynamics (CFD), structured packing (SpiroPak)
Procedia PDF Downloads 87
2818 Adapting Tools for Text Monitoring and for Scenario Analysis Related to the Field of Social Disasters
Authors: Svetlana Cojocaru, Mircea Petic, Inga Titchiev
Abstract:
Humanity is confronted more and more often with different social disasters, which in turn can generate new accidents and catastrophes. To mitigate their consequences, it is important to obtain the earliest possible signals about events that are occurring or may occur and to prepare the corresponding scenarios that could be applied. Our research is focused on solving two problems in this domain: identifying signals that an accident has occurred or may occur, and mitigating some consequences of disasters. To solve the first problem, methods of selecting and processing texts from the Internet are developed. Information in Romanian is of special interest for us. In order to obtain the mentioned tools, we follow several steps, divided into a preparatory stage and a processing stage. During the first stage, we manually collected over 724 news articles and classified them into 10 categories of social disasters; together they constitute more than 150 thousand words. Using this information, a controlled vocabulary of more than 300 keywords was elaborated, which will help in the process of classification and identification of texts related to the field of social disasters. To solve the second problem, the formalism of Petri nets has been used. We deal with the problem of evacuating inhabitants in useful time. Analysis methods such as the reachability or coverability tree and the invariants technique are used to determine the dynamic properties of the modeled systems. To perform a case study of the properties of the evacuation system extended by adding time, the analysis modules of PIPE, such as Generalized Stochastic Petri Nets (GSPN) Analysis, Simulation, State Space Analysis, and Invariant Analysis, have been used. These modules helped us to obtain the average number of persons situated in the rooms and other quantitative properties and characteristics related to the system's dynamics.
Keywords: lexicon of disasters, modelling, Petri nets, text annotation, social disasters
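The reachability analysis mentioned for the evacuation model can be illustrated with a minimal place/transition Petri net. The toy two-place-corridor net below is an invented stand-in, not the authors' PIPE model:

```python
from collections import deque

def enabled(marking, pre):
    """A transition is enabled if every input place holds enough tokens."""
    return all(marking[p] >= w for p, w in pre.items())

def fire(marking, pre, post):
    """Fire a transition: consume input tokens, produce output tokens."""
    m = list(marking)
    for p, w in pre.items():
        m[p] -= w
    for p, w in post.items():
        m[p] += w
    return tuple(m)

def reachability(initial, transitions):
    """Breadth-first construction of the reachability set."""
    seen = {initial}
    queue = deque([initial])
    while queue:
        m = queue.popleft()
        for pre, post in transitions:
            if enabled(m, pre):
                m2 = fire(m, pre, post)
                if m2 not in seen:
                    seen.add(m2)
                    queue.append(m2)
    return seen

# Toy model: places 0=room, 1=corridor, 2=exit; two persons start in the room.
# t1 moves a person from the room to the corridor, t2 from corridor to exit.
transitions = [
    ({0: 1}, {1: 1}),  # t1
    ({1: 1}, {2: 1}),  # t2
]
reachable = reachability((2, 0, 0), transitions)
```

In this toy net the full-evacuation marking (0, 0, 2) is reachable; timed properties such as average room occupancy, as in the abstract, require the stochastic extensions provided by tools like PIPE.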
Procedia PDF Downloads 197
2817 Harnessing the Benefits and Mitigating the Challenges of Neurosensitivity for Learners: A Mixed Methods Study
Authors: Kaaryn Cater
Abstract:
People vary in how they perceive, process, and react to internal, external, social, and emotional environmental factors; some are more sensitive than others. Highly sensitive people have a highly reactive nervous system and are more impacted by positive and negative environmental conditions (Differential Susceptibility). Further, some sensitive individuals are disproportionately able to benefit from positive and supportive environments without necessarily suffering negative impacts in less supportive environments (Vantage Sensitivity). Environmental sensitivity is underpinned by physiological, genetic, and personality/temperamental factors, and the phenotypic expression of high sensitivity is Sensory Processing Sensitivity. The hallmarks of Sensory Processing Sensitivity are deep cognitive processing, emotional reactivity, high levels of empathy, noticing environmental subtleties, a tendency to pause and observe in new and novel situations, and a propensity to become overwhelmed when over-stimulated. Several educational advantages associated with high sensitivity include creativity, enhanced memory, divergent thinking, giftedness, and metacognitive monitoring. High sensitivity can also lead to some educational challenges, particularly managing multiple conflicting demands and negotiating low sensory thresholds. A mixed methods study was undertaken. In the first quantitative study, participants completed the Perceived Success in Study Survey (PSISS) and the Highly Sensitive Person Scale (HSPS-12). Inclusion criteria were current or previous postsecondary education experience. The survey was presented on social media, and snowball recruitment was employed (n=365). The Excel spreadsheets were uploaded to the Statistical Package for the Social Sciences (SPSS) v26, and descriptive statistics found normal distribution.
T-tests and analysis of variance (ANOVA) calculations found no difference in the responses of demographic groups, and Principal Components Analysis and post-hoc Tukey calculations identified positive associations between high sensitivity and three of the five PSISS factors. Further ANOVA calculations found positive associations between the PSISS and two of the three sensitivity subscales. This study included a response field to register interest in further research. Respondents who scored in the 70th percentile on the HSPS-12 were invited to participate in a semi-structured interview. Thirteen interviews were conducted remotely (12 female). Reflexive inductive thematic analysis was employed to analyse the data, and a descriptive approach was employed to present data reflective of participant experience. The results of this study found that highly sensitive students prioritize work-life balance; employ a range of practical metacognitive study and self-care strategies; value independent learning; connect with learning that is meaningful; and are bothered by aspects of the physical learning environment, including lighting, noise, and indoor environmental pollutants. There is a dearth of research investigating sensitivity in the educational context, and these studies highlight the need to promote widespread education-sector awareness of environmental sensitivity and the need to include sensitivity in sector and institutional diversity and inclusion initiatives.
Keywords: differential susceptibility, highly sensitive person, learning, neurosensitivity, sensory processing sensitivity, vantage sensitivity
Procedia PDF Downloads 65
2816 GNSS-Aided Photogrammetry for Digital Mapping
Authors: Muhammad Usman Akram
Abstract:
This research work is based on GNSS-aided photogrammetry for digital mapping. It focuses on the topographic survey of an area or site which is to be used in future planning and development (P&D) or for further examination, exploration, research, and inspection. Survey and mapping in hard-to-access and hazardous areas are very difficult using traditional techniques and methodologies; they are also time-consuming and labor-intensive and give less precision with limited data. In comparison, the advanced techniques save manpower and provide more precise output with a wide variety of multiple data sets. In this experimentation, the aerial photogrammetry technique is used: a UAV flies over an area, captures geocoded images, and a three-dimensional model (3-D model) is produced. The UAV operates on a user-specified path or area with various parameters: flight altitude, ground sampling distance (GSD), image overlap, camera angle, etc. For ground control, a network of points on the ground is observed as Ground Control Points (GCPs) using the Differential Global Positioning System (DGPS) in PPK or RTK mode. The raw data collected by the UAV and DGPS are then processed in digital image processing programs and computer-aided design software. The outputs are a dense point cloud, a Digital Elevation Model (DEM), and an orthophoto. The imagery is converted into geospatial data by digitizing over the orthophoto, and the DEM is further converted into a Digital Terrain Model (DTM) for contour generation or a digital surface. As a result, a digital map of the surveyed area is obtained. In conclusion, we compared the processed data with exact measurements taken on site; errors are accepted if they do not exceed the survey accuracy limits set by the concerned institutions.
Keywords: photogrammetry, post-processing kinematic, real-time kinematic, manual data inquiry
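Among the flight parameters listed, the ground sampling distance follows directly from camera geometry: ground width per pixel scales with altitude and sensor width, and inversely with focal length and image width. A small sketch with invented camera parameters (a real survey would take these from the UAV camera's datasheet):

```python
def ground_sampling_distance(altitude_m, sensor_width_mm,
                             focal_length_mm, image_width_px):
    """GSD in cm/pixel: (altitude * sensor width) / (focal length * image width)."""
    return (altitude_m * 100.0 * sensor_width_mm) / (focal_length_mm * image_width_px)

# Hypothetical camera: 13.2 mm sensor, 8.8 mm focal length,
# 5472 px image width, flown at 100 m above ground.
gsd = ground_sampling_distance(100, 13.2, 8.8, 5472)  # about 2.74 cm/pixel
```

Flight planning then inverts this relation: the altitude is chosen so that the resulting GSD meets the map's accuracy requirement.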
Procedia PDF Downloads 31
2815 Cross-Language Variation and the ‘Fused’ Zone in Bilingual Mental Lexicon: An Experimental Research
Authors: Yuliya E. Leshchenko, Tatyana S. Ostapenko
Abstract:
Language variation is a widespread linguistic phenomenon which can affect different levels of a language system: phonological, morphological, lexical, syntactic, etc. It is obvious that the scope of possible standard alternations within a particular language is limited by a variety of its norms and regulations which set more or less clear boundaries for what is possible and what is not possible for the speakers. The possibility of lexical variation (alternate usage of lexical items within the same contexts) is based on the fact that the meanings of words are not clearly and rigidly defined in the consciousness of the speakers. Therefore, lexical variation is usually connected with an unstable relationship between words and their referents: a case when a particular lexical item refers to different types of referents, or when a particular referent can be named by various lexical items. We assume that the scope of lexical variation in bilingual speech is generally wider than that observed in monolingual speech due to the fact that, besides ‘lexical item – referent’ relations, it involves the possibility of cross-language variation of L1 and L2 lexical items. We use the term ‘cross-language variation’ to denote a case when two equivalent words of different languages are treated by a bilingual speaker as freely interchangeable within the common linguistic context. As distinct from code-switching, which is traditionally defined as the conscious use of more than one language within one communicative act, in the case of cross-language lexical variation the speaker does not perceive the alternate lexical items as belonging to different languages and, therefore, does not realize the change of language code. In the paper, the authors present research on the lexical variation of adult Komi-Permyak – Russian bilingual speakers.
The two languages co-exist on the territory of the Komi-Permyak District in Russia (Komi-Permyak as the ethnic language and Russian as the official state language), are usually acquired from birth in a natural linguistic environment and, according to the data of sociolinguistic surveys, are both identified by the speakers as coordinate mother tongues. The experimental research demonstrated that alternation of Komi-Permyak and Russian words within one utterance/phrase is highly frequent both in speech perception and production. Moreover, our participants estimated cross-language word combinations like ‘маленькая /Russian/ нывка /Komi-Permyak/’ (‘a little girl’) or ‘мунны /Komi-Permyak/ домой /Russian/’ (‘go home’) as regular/habitual, containing no violation of any linguistic rules, and as equally possible in speech as the equivalent intra-language word combinations (‘учöтик нывка’ /Komi-Permyak/ or ‘идти домой’ /Russian/). All the facts considered, we claim that constant concurrent use of the two languages results in a large number of their words tending to be intuitively interpreted by the speakers as lexical variants not only related to the same referent, but also referring to both languages or, more precisely, to none of them in particular. Consequently, we can suppose that the bilingual mental lexicon includes an extensive ‘fused’ zone of lexical representations that provides the basis for cross-language variation in bilingual speech.
Keywords: bilingualism, bilingual mental lexicon, code-switching, lexical variation
Procedia PDF Downloads 148
2814 Controlling Drone Flight Missions through Natural Language Processors Using Artificial Intelligence
Authors: Sylvester Akpah, Selasi Vondee
Abstract:
Drones, also known as Unmanned Aerial Vehicles (UAVs), have attracted increasing attention in recent years due to their ubiquitous nature and boundless applications in the areas of communication, surveying, aerial photography, weather forecasting, medical delivery, and surveillance, amongst others. Operated remotely in real-time or pre-programmed, drones can fly autonomously or on pre-defined routes. The application of these aerial vehicles has successfully penetrated the world due to technological evolution, and thus many more businesses are utilizing their capabilities. Unfortunately, while drones offer the benefits stated above, they are riddled with some problems, mainly attributed to the complexities of learning how to master drone flights, collision avoidance, and enterprise security. Additional challenges arise because the analysis of flight data recorded by sensors attached to the drone may take time and require expert help to analyse and understand. This paper presents an autonomous drone control system using a chatbot. The system allows for easy control of drones through conversation, with the aid of Natural Language Processing, reducing the workload needed to set up, deploy, control, and monitor drone flight missions. The results obtained at the end of the study revealed that the drone connected to the chatbot was able to initiate flight missions with just text and voice commands, hold a conversation, and give real-time feedback on data and requests made to the chatbot. The results further revealed that the system was able to process natural language and produced human-like conversational abilities using Artificial Intelligence (Natural Language Understanding). It is recommended that radio signal adapters be used instead of wireless connections to increase the range of communication with the aerial vehicle.
Keywords: artificial intelligence, chatbot, natural language processing, unmanned aerial vehicle
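The command-to-mission mapping the system performs can be caricatured with a keyword/regex intent parser. Everything below (intent names, phrasings) is hypothetical; the actual system described above relies on AI-based Natural Language Understanding rather than fixed patterns:

```python
import re

# Hypothetical intents: each regex maps utterance patterns to a drone command.
INTENTS = [
    (re.compile(r"\b(?:take ?off|launch)\b"), "TAKEOFF"),
    (re.compile(r"\b(?:land|come down)\b"), "LAND"),
    (re.compile(r"\bfly (?:up |to )?(?P<arg>\d+)\s*(?:m|meters?)\b"), "SET_ALTITUDE"),
    (re.compile(r"\b(?:photo|picture)\b"), "CAPTURE_IMAGE"),
]

def parse_command(utterance):
    """Map a free-text utterance to a (command, argument) pair, or None."""
    text = utterance.lower()
    for pattern, command in INTENTS:
        m = pattern.search(text)
        if m:
            return (command, m.groupdict().get("arg"))
    return None
```

A pattern-based parser illustrates the mapping but fails on paraphrases ("get airborne"), which is precisely why an NLU model is preferable for conversational control.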
Procedia PDF Downloads 142
2813 Processing, Nutritional Assessment and Sensory Evaluation of Bakery Products Prepared from Orange Fleshed Sweet Potatoes (OFSP) and Wheat Composite Flours
Authors: Hategekimana Jean Paul, Irakoze Josiane, Ishimweyizerwe Valentin, Iradukunda Dieudonne, Uwanyirigira Jeannette
Abstract:
Orange fleshed sweet potatoes (OFSP) are widely grown and plentifully available in rural and urban local markets, and their contribution to the reduction of food insecurity in Rwanda is considerable. However, the postharvest loss of this commodity is a critical challenge due to its high perishability. Several research activities have been conducted on how fresh food commodities can be transformed into extended shelf life food products to prevent post-harvest losses, but such activity has not yet been well studied in Rwanda. The aim of the present study was to process baked products from OFSP combined with wheat composite flour and to assess the nutritional content and consumer acceptability of the newly developed products. The perishability of OFSP, and the related scarcity during the off season, can be mitigated by producing cake, doughnut, and bread with OFSP puree or flour. Doughnut and bread were prepared by making OFSP puree, mixing it with the other ingredients into a dough, and then frying or baking; for cake, OFSP was dried in a solar dryer to obtain flour, which was mixed with wheat flour and the other ingredients into a cake dough and baked. For each product, one control and three experimental samples were prepared (three ratios of 30, 40, and 50% OFSP, with the remaining percentage wheat flour). All samples, including the control, were analyzed for consumer acceptability (sensory attributes). The most preferred samples (one sample for each product, with its control sample, for each OFSP variety) were analyzed for nutritional composition along with the control sample. The cake from the Terimbere variety and the bread from Gihingumukungu, supplemented with 50% OFSP flour or puree respectively, were the most acceptable, except for the doughnut from the Vita variety, which was highly accepted at 50% OFSP supplementation. The moisture, ash, protein, fat, fiber, total carbohydrate, vitamin C, reducing sugar, and mineral (sodium, potassium, and phosphorus)
content was different among the products. Cake was rich in fiber (14.71%), protein (6.59%), and vitamin C (19.988 mg/100 g) compared to the other samples, while bread was found to be rich in reducing sugar (12.71 mg/100 g) compared to cake and doughnut. Doughnut was found to be rich in fat (6.89%) compared to the other samples. In the sensory analysis, doughnut was highly accepted at the 60:40 ratio compared to the other products, while cake was least accepted at the 50:50 ratio. The proximate composition and mineral content of all the OFSP products were significantly higher than those of the control samples.
Keywords: post-harvest loss, OFSP products, wheat flour, sensory evaluation, proximate composition
Procedia PDF Downloads 62
2812 Hardware Implementation on Field Programmable Gate Array of Two-Stage Algorithm for Rough Set Reduct Generation
Authors: Tomasz Grzes, Maciej Kopczynski, Jaroslaw Stepaniuk
Abstract:
The rough sets theory developed by Prof. Z. Pawlak is one of the tools that can be used in intelligent systems for data analysis and processing. Banking, medicine, image recognition, and security are among the possible fields of utilization. In all these fields, the amount of collected data is increasing quickly, but with the increase of the data, computation speed becomes the critical factor. Data reduction is one of the solutions to this problem. Removing the redundancy in rough sets can be achieved with a reduct. Many algorithms for generating the reduct have been developed, but most of them are only software implementations and therefore have many limitations. A microprocessor uses a fixed word length and consumes a lot of time for both fetching and processing of instructions and data; consequently, software-based implementations are relatively slow. Hardware systems do not have these limitations and can process the data faster than software. A reduct is a subset of the condition attributes that preserves the discernibility of the objects. For a given decision table there can be more than one reduct. The core is the set of all indispensable condition attributes: none of its elements can be removed without affecting the classification power of all condition attributes, and every reduct contains all the attributes from the core. In this paper, the hardware implementation of a two-stage greedy algorithm to find one reduct is presented. The decision table is used as an input. The output of the algorithm is a superreduct, which is a reduct with some additional removable attributes. The first stage of the algorithm calculates the core using the discernibility matrix. The second stage generates the superreduct by enriching the core with the most common attributes, i.e., attributes that are more frequent in the decision table.
The algorithm described above has two disadvantages: i) it generates a superreduct instead of a reduct, and ii) the additional first stage may be unnecessary if the core is empty. But for systems focused on fast computation of the reduct, the first disadvantage is not the key problem. The core calculation can be achieved with a combinational logic block and thus adds relatively little time to the whole process. The algorithm presented in this paper was implemented in a Field Programmable Gate Array (FPGA) as a digital device consisting of blocks that process the data in a single step. Calculating the core is done by comparators connected to a block called the 'singleton detector', which detects whether the input word contains only a single 'one'. Calculating the number of occurrences of an attribute is performed in a combinational block made up of a cascade of adders. The superreduct generation process is iterative and thus needs a sequential circuit for controlling the calculations. For research purposes, the algorithm was also implemented in the C language and run on a PC, and the execution times of the reduct calculation in hardware and in software were compared. The results show an increase in the speed of data processing.
Keywords: data reduction, digital systems design, field programmable gate array (FPGA), reduct, rough set
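In software terms, the two stages can be paraphrased as: (1) collect the core from the singleton cells of the discernibility matrix, then (2) greedily add the most frequent attribute until every cell is covered. The Python sketch below mirrors that scheme on an invented toy decision table; it is a simplified illustration, not the FPGA design:

```python
from collections import Counter
from itertools import combinations

def discernibility_cells(table, decision):
    """For each object pair with different decisions, the set of condition
    attributes on which the two objects differ (one matrix cell per pair)."""
    cells = []
    for (r1, d1), (r2, d2) in combinations(zip(table, decision), 2):
        if d1 != d2:
            cells.append({a for a in range(len(r1)) if r1[a] != r2[a]})
    return cells

def core(cells):
    """Stage 1: singleton cells name indispensable attributes."""
    return {next(iter(c)) for c in cells if len(c) == 1}

def superreduct(table, decision):
    """Stage 2: extend the core with the most frequent attribute among the
    still-uncovered cells, until every cell is covered."""
    cells = discernibility_cells(table, decision)
    chosen = core(cells)
    uncovered = [c for c in cells if not c & chosen]
    while uncovered:
        freq = Counter(a for c in uncovered for a in c)
        chosen.add(freq.most_common(1)[0][0])
        uncovered = [c for c in uncovered if not c & chosen]
    return chosen

# Toy decision table: rows are objects, columns are condition attributes.
table = [(1, 0, 0), (1, 0, 1), (0, 1, 0), (1, 1, 1)]
decision = [0, 1, 0, 1]
```

On this toy table the core is attribute 2, which already covers every discernibility cell, so the superreduct equals the core; in general the result may contain removable attributes, which is the first disadvantage noted above.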
Procedia PDF Downloads 219
2811 Signal Processing of the Blood Pressure and Characterization
Authors: Hadj Abd El Kader Benghenia, Fethi Bereksi Reguig
Abstract:
In clinical medicine, blood pressure monitoring provides rich pathophysiological information about the cardiovascular system, described through factors such as blood volume, arterial compliance, and peripheral resistance. In this work, we are interested in analyzing these signals and propose a detection algorithm to delineate the different sequences, especially the systolic blood pressure (SBP), the diastolic blood pressure (DBP), and the dicrotic wave, and to analyze them in order to extract the cardiovascular parameters.
Keywords: blood pressure, SBP, DBP, detection algorithm
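As an illustration of the kind of delineation such an algorithm performs, the sketch below picks systolic peaks and the diastolic minimum preceding each peak on a synthetic pressure wave. The mean threshold and refractory period are simplifying assumptions, not the authors' algorithm (which also delineates the dicrotic wave):

```python
import math

def detect_sbp_dbp(signal, fs, min_rr=0.4):
    """Find systolic peaks (local maxima above the signal mean) and the
    diastolic minimum preceding each peak. min_rr is a refractory period (s)."""
    mean = sum(signal) / len(signal)
    gap = int(min_rr * fs)
    peaks = []
    for i in range(1, len(signal) - 1):
        if signal[i] > mean and signal[i] >= signal[i - 1] and signal[i] > signal[i + 1]:
            if not peaks or i - peaks[-1] >= gap:
                peaks.append(i)
    troughs = []
    for k, p in enumerate(peaks):
        start = peaks[k - 1] if k else 0
        seg = signal[start:p]
        troughs.append(start + min(range(len(seg)), key=seg.__getitem__))
    return peaks, troughs

# Synthetic pressure wave: 1 Hz beats around 90 mmHg with a 30 mmHg swing.
fs = 100
wave = [90 + 30 * math.sin(2 * math.pi * 1.0 * i / fs) for i in range(300)]
peaks, troughs = detect_sbp_dbp(wave, fs)
```

On real arterial pressure signals the beat is asymmetric and noisy, so practical delineators add filtering and adaptive thresholds before peak picking.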
Procedia PDF Downloads 439
2810 Survey of Communication Technologies for IoT Deployments in Developing Regions
Authors: Namugenyi Ephrance Eunice, Julianne Sansa Otim, Marco Zennaro, Stephen D. Wolthusen
Abstract:
The Internet of Things (IoT) is a network of connected data processing devices, mechanical and digital machinery, items, animals, or people that may send data across a network without requiring human-to-human or human-to-computer interaction. Each component has sensors that can pick up on specific phenomena, as well as processing software and other technologies that can link to and communicate with other systems and/or devices over the Internet or other communication networks and exchange data with them. IoT is increasingly being used in fields other than consumer electronics, such as public safety, emergency response, industrial automation, autonomous vehicles, the Internet of Medical Things (IoMT), and general environmental monitoring. Consumer-based IoT applications, like smart home gadgets and wearables, are also becoming more prevalent. This paper presents the main IoT deployment areas for environmental monitoring in developing regions and the backhaul options suitable for them. A detailed review of each paper selected for the study is included in Section III of this document. The study includes an overview of existing IoT deployments and the underlying communication architectures, protocols, and technologies that support them. This overview shows that Low Power Wide Area Networks (LPWANs), as summarized in Table 1, are very well suited for environmental monitoring architectures designed for remote locations. LoRa technology, particularly the LoRaWAN protocol, has an advantage over other technologies due to its low power consumption, adaptability, and suitable communication range. The prevailing challenges of the different architectures are discussed and summarized in Table 3 of Section IV, where the main problem is the obstruction of communication paths by buildings, trees, hills, etc.
Keywords: communication technologies, environmental monitoring, Internet of Things, IoT deployment challenges
Procedia PDF Downloads 85
2809 Valorization of Underutilized Fish Species Through a Multidisciplinary Approach
Authors: Tiziana Pepe, Gerardo Manfreda, Adriana Ianieri, Aniello Anastasio
Abstract:
The sustainable exploitation of marine biological resources is among the most important objectives of the EU's Common Fisheries Policy (CFP). Currently, Europe imports about 65% of its fish products, indicating that domestic production does not meet consumer demand. Despite the availability of numerous commercially significant fish species, European consumption is concentrated on a limited number of products (e.g., sea bass, sea bream, shrimp). Many native species, present in large quantities in the Mediterranean Sea, are little known to consumers and are therefore considered ‘fishing by-products’. All the data presented so far indicate a significant waste of local resources and the overexploitation of a few fish stocks. It is therefore necessary to develop strategies that guide the market towards sustainable conversion. The objective of this work was to valorize underutilized fish species of the Mediterranean Sea through a multidisciplinary approach. To this end, three fish species were sampled: Atlantic Horse Mackerel (Trachurus trachurus), Bogue (Boops boops), and Common Dolphinfish (Coryphaena hippurus). Nutritional properties (water %, fats, proteins, ashes, salts), physical/chemical properties (TVB-N, histamine, pH), and rheological properties (color, texture, viscosity) were analyzed. The analyses were conducted on both fillets and processing by-products. Additionally, mitochondrial DNA (mtDNA) was extracted from the muscle of each species. The mtDNA was then sequenced using the Illumina NGS technique. The analysis of nutritional properties classified the fillets of the sampled species as lean or semi-fat, as they had a fat content of less than 3%, while the by-products showed a higher lipid content (2.7-5%). The protein percentage for all fillets was 22-23%, while for processing by-products, the protein concentration was 18-19% for all species. 
Rheological analyses showed an increase in viscosity in saline solution for all species, indicating their potential suitability for industrial processing. Complete mtDNA of high quality and quantity was extracted from all analyzed species, and the complete mitochondrial genome sequences were successfully obtained and annotated. The results of this study suggest that all analyzed species are suitable for both human consumption and feed production. The sequencing of the complete mtDNA and its availability in international databases will be useful for accurate phylogenetic analysis and proper species identification, even in prepared and processed products. Underutilized fish species represent an important economic resource. Encouraging their consumption could limit the phenomenon of overfishing, protecting marine biodiversity. Furthermore, the valorization of these species will increase national fish production, supporting the local economy and cultural and gastronomic traditions, and optimizing the exploitation of Mediterranean resources in accordance with the CFP.
Keywords: mtDNA, nutritional analysis, sustainable fisheries, underutilized fish species
Procedia PDF Downloads 30
2808 Effect of Processing Parameters on the Physical Properties of Pineapple Pomace Based Aquafeed
Authors: Oluwafemi Babatunde Oduntan, Isaac A. Bamgboye
Abstract:
The disposal and management of solid waste from pineapple juice processing constitute environmental contamination affecting public health. The use of this by-product, called pomace, has the potential to reduce the cost of aquafeed. Pineapple pomace collected after juice extraction was dried and milled. The interactive effects of feeding rate (1.28, 1.44 and 1.60 kg/min), screw speed (305, 355 and 405 rpm), moisture content (16, 19 and 22%), temperature (60, 80, 100 and 120°C), cutting speed (1300, 1400 and 1500 rpm), pomace inclusion ratio (5, 10, 15, 20%) and open surface die (50, 75 and 100%) on the extrudate physical properties (bulk density, unit density, expansion ratio, durability and floatability) were investigated using an optimal custom design (OCD) matrix and response surface methodology. The predicted values were found to be in good agreement with the experimental values for expansion ratio, durability and floatability (R2 = 0.7970, 0.9264 and 0.9098, respectively), with the exceptions of unit density and bulk density (R2 = 0.1639 and 0.2768, respectively). All the extrudates showed relatively high floatability and durability. The inclusion of pineapple pomace produced less expanded and more compactly textured extrudates. Results indicated that increases in pineapple pomace, screw speed, and feeding rate decreased the unit density, bulk density, expansion ratio, durability, and floatability of the extrudate, whereas increasing the moisture content of the feed mash increased unit density and bulk density. Increasing the extrusion temperature and cutting speed increased the floatability and durability of the extrudate. The proportion of pineapple pomace in the extruded aquafeed product was observed to have a significantly lower effect on the selected responses.
Keywords: aquafeed, extrusion, physical properties, pineapple pomace, waste
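Two of the physical properties listed reduce to simple geometry; the sketch below shows the formulas typically used for expansion ratio and unit density, with invented dimensions (the study's actual measurement protocol is not specified here):

```python
import math

def expansion_ratio(extrudate_diameter_mm, die_diameter_mm):
    """Radial expansion ratio: extrudate diameter over die opening diameter."""
    return extrudate_diameter_mm / die_diameter_mm

def unit_density(mass_g, diameter_mm, length_mm):
    """Mass per cylindrical-pellet volume, in g/cm^3."""
    radius_cm = diameter_mm / 20.0            # mm diameter -> cm radius
    volume_cm3 = math.pi * radius_cm ** 2 * (length_mm / 10.0)
    return mass_g / volume_cm3

# Hypothetical pellet: 4.5 mm diameter out of a 3.0 mm die, 10 mm long, 0.08 g.
er = expansion_ratio(4.5, 3.0)        # 1.5
rho = unit_density(0.08, 4.5, 10.0)   # about 0.5 g/cm^3
```

A unit density below that of water (1 g/cm^3) is what makes a pellet float, which is why floatability tracks the density responses discussed above.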
Procedia PDF Downloads 271
2807 Analysis of Magnetic Anomaly Data for Identification Structure in Subsurface of Geothermal Manifestation at Candi Umbul Area, Magelang, Central Java Province, Indonesia
Authors: N. A. Kharisa, I. Wulandari, R. Narendratama, M. I. Faisal, K. Kirana, R. Zipora, I. Arfiansah, I. Suyanto
Abstract:
A geophysical survey using the magnetic method was carried out at the geothermal manifestation at Candi Umbul, Grabag, Magelang, Central Java Province, on 10-12 May 2013. The objective of this research is to interpret the geological structures that control the geothermal system in the Candi Umbul area. The survey covered an area of 1.5 km x 2 km with a station spacing and line spacing of 150 m, using a Geometrics G-856 proton precession magnetometer (PPM). Data processing started with IGRF and diurnal variation corrections to obtain the total magnetic field anomaly. Further processing comprised reduction to the pole, upward continuation, and residual anomaly separation; these results form the basis of the qualitative interpretation. The largest low-anomaly zone is located in the center of the survey area, coinciding with the hot spring manifestation, and is interpreted as a demagnetization zone indicating heat source activity. The anomaly map was then modeled for the quantitative interpretation step. The modeling yields a model of the rock layers and geological structure that informs the geothermal system, from which the lithology susceptibilities can be interpreted: andesite as heat source (k = 0.00014 emu), basalt as alteration rock (k = 0.0016 emu), volcanic breccia as reservoir rock (k = 0.0026 emu), porphyritic andesite as cap rock (k = 0.004 emu), andesite lava (k = 0.003 emu), and alluvium (k = 0.0007 emu). The hot spring manifestation is controlled by a normal fault that forms a weak zone easily passed by hot water rising from the geothermal reservoir.
Keywords: geological structure, geothermal system, magnetic, susceptibility
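The first step of the processing chain above (observed field minus IGRF minus diurnal drift) can be sketched as follows; a minimal illustration assuming a base-station record for the drift and a hypothetical IGRF value, not the authors' actual processing code:

```python
def interp_base(t, base_times, base_values):
    """Linearly interpolate the base-station field reading at time t."""
    for i in range(len(base_times) - 1):
        if base_times[i] <= t <= base_times[i + 1]:
            frac = (t - base_times[i]) / (base_times[i + 1] - base_times[i])
            return base_values[i] + frac * (base_values[i + 1] - base_values[i])
    raise ValueError("time outside base-station record")

def total_field_anomaly(observed, t, igrf, base_times, base_values, base_datum):
    """Anomaly = observed - IGRF - diurnal drift (base reading minus its datum)."""
    drift = interp_base(t, base_times, base_values) - base_datum
    return observed - igrf - drift
```

Reduction to the pole and upward continuation are then applied to the gridded anomaly, typically via 2-D Fourier-domain filters.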
Procedia PDF Downloads 384
2806 Automatic Lexicon Generation for Domain Specific Dataset for Mining Public Opinion on China Pakistan Economic Corridor
Authors: Tayyaba Azim, Bibi Amina
Abstract:
The increase in the popularity of opinion mining, with the rapid growth in the availability of social networks, has created many opportunities for research across the various domains of Sentiment Analysis and Natural Language Processing (NLP) using Artificial Intelligence approaches. The latest trend allows the public to actively use the internet to analyze an individual’s opinion and explore the effectiveness of published facts. The main theme of this research is to assess public opinion on one of the most crucial and extensively discussed development projects, the China-Pakistan Economic Corridor (CPEC), considered a game changer due to its promise of bringing economic prosperity to the region. So far, to the best of our knowledge, the theme of CPEC has not been analyzed for sentiment determination through a machine learning (ML) approach. This research aims to demonstrate the use of ML approaches to automatically analyze public sentiment in Twitter tweets, particularly about CPEC. A Support Vector Machine (SVM) is used for the classification task, classifying tweets into positive, negative and neutral classes. Word2vec and TF-IDF features are used with the SVM model, and a comparison of the model trained on manually labelled tweets against the automatically generated lexicon is performed. The contributions of this work are: development of a sentiment analysis system for public tweets on the CPEC subject; automatic generation of a lexicon of public tweets on CPEC; and identification of different themes among tweets, with sentiments assigned to each theme. It is worth noting that applications of web mining that empower e-democracy by improving political transparency and public participation in decision making via social media have not yet been explored and practised in Pakistan with respect to CPEC.
Keywords: machine learning, natural language processing, sentiment analysis, support vector machine, Word2vec
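The TF-IDF features mentioned above can be sketched in a few lines; a minimal pure-Python illustration of the feature computation (smoothed IDF variant), not the authors' actual pipeline, which also uses Word2vec embeddings and an SVM classifier:

```python
import math
from collections import Counter

def tfidf(docs):
    """Compute smoothed TF-IDF vectors for tokenized documents.

    docs: list of token lists. Returns one {term: weight} dict per document,
    using tf = count/len(doc) and idf = log(N/df) + 1.
    """
    n = len(docs)
    df = Counter()                      # document frequency of each term
    for doc in docs:
        df.update(set(doc))
    idf = {w: math.log(n / df[w]) + 1.0 for w in df}
    vectors = []
    for doc in docs:
        tf = Counter(doc)
        vectors.append({w: (tf[w] / len(doc)) * idf[w] for w in tf})
    return vectors
```

These sparse vectors would then be fed to a linear classifier; terms common to all documents (like a shared hashtag) receive the minimum weight.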
Procedia PDF Downloads 148
2805 Economic Assessment of the Fish Solar Tent Dryers
Authors: Collen Kawiya
Abstract:
In an effort to reduce post-harvest losses and improve the supply of quality fish products in Malawi, fish solar tent dryers have been designed in the southern part of Lake Malawi for processing small fish species under the Cultivate Africa’s Future (CultiAF) project. This study was done to promote the adoption of the fish solar tent dryers by the many small-scale fish processors in Malawi through an assessment of the economic viability of these dryers. Using the project’s baseline survey data, a business model for a constructed ‘ready for use’ solar tent dryer was developed, investment appraisal measures were calculated, and a sensitivity analysis was performed. The study also conducted a risk analysis using the Monte Carlo simulation technique, yielding a probabilistic net present value. The investment appraisal results showed that the net present value was US$8,756.85, the internal rate of return was 62%, well above the 16.32% cost of capital, and the payback period was 1.64 years. The sensitivity analysis showed that only two input variables influenced the investment’s net present value: the dried fish selling price, which correlated positively with the net present value, and the fresh fish buying price, which correlated negatively with it. The risk analysis showed that the chance that fish processors will make a loss from this type of investment is 17.56%, and that the probability of experiencing a negative net present value is only 0.20. Lastly, the study found that the net present value of the fish solar tent dryer investment remains robust under changes in investors’ risk preferences. With these results, it is concluded that the fish solar tent dryers in Malawi are an economically viable investment because they improve the returns of the fish processing activity. As such, fish processors should adopt them by investing their money to construct and use them.
Keywords: investment appraisal, risk analysis, sensitivity analysis, solar tent drying
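The appraisal measures used above (net present value, internal rate of return, payback period) can be sketched as follows; a minimal illustration with the IRR found by bisection, using hypothetical cash flows rather than the study's data:

```python
def npv(rate, cashflows):
    """Net present value; cashflows[0] is the initial (usually negative) outlay at t=0."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows))

def irr(cashflows, lo=-0.99, hi=10.0, tol=1e-7):
    """Internal rate of return via bisection on NPV(rate) = 0.

    Assumes a conventional cash-flow pattern (one sign change), so NPV is
    decreasing in the rate over the bracket.
    """
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if npv(mid, cashflows) > 0:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

def payback_period(cashflows):
    """Years until cumulative cash flow turns non-negative, interpolating
    linearly inside the breakeven year; None if never recovered."""
    cumulative = cashflows[0]
    for t, cf in enumerate(cashflows[1:], start=1):
        if cumulative + cf >= 0:
            return t - 1 + (-cumulative) / cf
        cumulative += cf
    return None
```

A project is accepted when the IRR exceeds the cost of capital, i.e. when NPV at the cost of capital is positive; the Monte Carlo step repeats this NPV calculation over sampled input prices.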
Procedia PDF Downloads 278
2804 Strength Evaluation by Finite Element Analysis of Mesoscale Concrete Models Developed from CT Scan Images of Concrete Cube
Authors: Nirjhar Dhang, S. Vinay Kumar
Abstract:
Concrete is a non-homogeneous mix of coarse aggregates, sand, cement, air voids and the interfacial transition zone (ITZ) around aggregates. Adopting these complex structures and material properties in numerical simulation leads to better understanding and design of concrete. In this work, a mesoscale model of concrete has been prepared from X-ray computed tomography (CT) images. These images are converted into a computer model and numerically simulated using commercially available finite element software. The mesoscale models are simulated under compressive displacement. The effects of the shape and distribution of aggregates, continuous and discrete ITZ thickness, voids, and variation of mortar strength have been investigated. The CT scan of a concrete cube consists of a series of two-dimensional slices; a total of 49 slices were obtained from a 150 mm cube, giving a slice interval of approximately 3 mm. Because CT scanning is non-destructive, the same cube can later be subjected to a compression test in a universal testing machine (UTM) to find its strength. The image processing and extraction of mortar and aggregates from the CT scan slices are performed by programming in Python. The digital colour image consists of red, green and blue (RGB) pixels. The RGB image is converted to a black and white (BW) image, and the mesoscale constituents are identified by thresholding the 0-255 grayscale values. A pixel matrix is created for modeling the mortar, aggregates, and ITZ. Pixel values are normalized to a 0-9 scale according to relative strength: zero is assigned to voids, values between 1-3 identify the boundary between aggregates and mortar, 4-6 mortar, and 7-9 aggregates. In the next step, triangular and quadrilateral elements for plane stress and plane strain models are generated, depending on the option given. Properties of materials, boundary conditions, and the analysis scheme are specified in this module. Responses such as displacements, stresses, and damage are evaluated by importing the input file into ABAQUS. This simulation evaluates the compressive strengths of the 49 slices of the cube. The model is meshed with more than sixty thousand elements. The effects of the shape and distribution of aggregates, the inclusion of voids, and variation of the ITZ layer thickness on load carrying capacity, stress-strain response and strain localization of concrete have been studied. The plane strain condition carried more load than the plane stress condition due to confinement. The CT scan technique can be used to obtain slices from concrete cores taken from an actual structure, and digital image processing can be used to find the shape and content of aggregates in the concrete. This may be further compared with test results of concrete cores and can be used as an important tool for strength evaluation of concrete.
Keywords: concrete, image processing, plane strain, interfacial transition zone
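The 0-255 to 0-9 normalization described above can be sketched as follows; a minimal illustration in which the grayscale cut-offs (`void_max`, `itz_max`, `mortar_max`) are hypothetical and would in practice be calibrated to the scanner and specimen:

```python
def _bin(v, v_lo, v_hi, c_lo, c_hi):
    """Linearly place v (in [v_lo, v_hi]) into the integer class range [c_lo, c_hi]."""
    span = v_hi - v_lo
    return c_lo + (v - v_lo) * (c_hi - c_lo) // max(span, 1)

def classify_pixels(gray, void_max=20, itz_max=60, mortar_max=170):
    """Map 0-255 grayscale CT values onto the 0-9 relative-strength scale:
    0 -> void, 1-3 -> aggregate/mortar boundary (ITZ), 4-6 -> mortar, 7-9 -> aggregate."""
    out = []
    for row in gray:
        new_row = []
        for v in row:
            if v <= void_max:
                new_row.append(0)                                   # void
            elif v <= itz_max:
                new_row.append(_bin(v, void_max + 1, itz_max, 1, 3))    # ITZ
            elif v <= mortar_max:
                new_row.append(_bin(v, itz_max + 1, mortar_max, 4, 6))  # mortar
            else:
                new_row.append(_bin(v, mortar_max + 1, 255, 7, 9))      # aggregate
        out.append(new_row)
    return out
```

The resulting class matrix is what the mesher consumes when assigning element materials.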
Procedia PDF Downloads 239
2803 A Hybrid Watermarking Scheme Using Discrete and Discrete Stationary Wavelet Transformation For Color Images
Authors: Bülent Kantar, Numan Ünaldı
Abstract:
This paper presents a new method for robust and invisible digital watermarking of color images, using color images as the watermark. Watermarking is performed in the frequency domain, using the discrete wavelet transform (DWT) and the discrete stationary wavelet transform (DSWT). Low, medium and high frequency coefficients are obtained by applying a two-level DWT to the original image. A one-level DSWT is then applied separately to each frequency coefficient band of the two-level DWT, and a watermark is added to every low frequency coefficient obtained from the one-level DSWT. In this way, watermarks are added to all frequency bands of the two-level DWT, for a total of four watermarks embedded in the original image. To recover the watermark, the two-level DWT and one-level DSWT are applied to both the original and the watermarked images, and the watermark is obtained from the difference of the DSWT low frequency coefficients. A total of four watermarks are thus recovered, one from each frequency band of the two-level DWT. The recovered watermarks are compared with the original watermark and a similarity score is obtained; the final watermark is taken from the highest similarity values. The proposed watermarking method is tested against geometric and image-processing attacks, and the results show that it is robust and invisible. All frequency bands of the two-level DWT are combined to recover the watermark from the watermarked image, and the watermark is added to the image after conversion to a binary image. These operations improve recovery of the watermark from watermarked images subjected to geometric and image-processing attacks.
Keywords: watermarking, DWT, DSWT, copyright protection, RGB
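The wavelet decomposition underlying such schemes can be illustrated with a one-level 2-D Haar DWT, the simplest wavelet, standing in here for the transforms used in the paper; a minimal sketch showing the forward/inverse pair on which additive coefficient-domain embedding can be built:

```python
def haar2d(img):
    """One-level 2-D Haar DWT of an even-sized matrix.

    Returns (LL, LH, HL, HH) sub-bands, each half the size of img.
    """
    h, w = len(img), len(img[0])
    LL = [[0.0] * (w // 2) for _ in range(h // 2)]
    LH = [[0.0] * (w // 2) for _ in range(h // 2)]
    HL = [[0.0] * (w // 2) for _ in range(h // 2)]
    HH = [[0.0] * (w // 2) for _ in range(h // 2)]
    for i in range(0, h, 2):
        for j in range(0, w, 2):
            a, b, c, d = img[i][j], img[i][j + 1], img[i + 1][j], img[i + 1][j + 1]
            LL[i // 2][j // 2] = (a + b + c + d) / 4   # average (low-low)
            LH[i // 2][j // 2] = (a - b + c - d) / 4   # horizontal detail
            HL[i // 2][j // 2] = (a + b - c - d) / 4   # vertical detail
            HH[i // 2][j // 2] = (a - b - c + d) / 4   # diagonal detail
    return LL, LH, HL, HH

def inverse_haar2d(LL, LH, HL, HH):
    """Exact inverse of haar2d (perfect reconstruction)."""
    h, w = len(LL) * 2, len(LL[0]) * 2
    img = [[0.0] * w for _ in range(h)]
    for i in range(len(LL)):
        for j in range(len(LL[0])):
            ll, lh, hl, hh = LL[i][j], LH[i][j], HL[i][j], HH[i][j]
            img[2 * i][2 * j] = ll + lh + hl + hh
            img[2 * i][2 * j + 1] = ll - lh + hl - hh
            img[2 * i + 1][2 * j] = ll + lh - hl - hh
            img[2 * i + 1][2 * j + 1] = ll - lh - hl + hh
    return img
```

Embedding then amounts to adding a scaled watermark to a chosen sub-band before inverting; extraction subtracts the original sub-band coefficients and divides by the embedding strength.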
Procedia PDF Downloads 535
2802 Nano-Enhanced In-Situ and Field Up-Gradation of Heavy Oil
Authors: Devesh Motwani, Ranjana S. Baruah
Abstract:
The prime incentive behind the upgrading of heavy oil is to increase its API gravity for ease of transportation to refineries, thus expanding the market access of bitumen-based crude. There has always been demand for an integrated approach that simplifies the upgrading scheme and makes it adaptable to the production site in terms of economics, environment, and personnel safety. Recent advances in nanotechnology have facilitated the development of two lines of heavy oil upgrading processes that make use of nano-catalysts to produce upgraded oil: in-situ upgrading and field upgrading. The in-situ upgrading scheme makes use of the Hot Fluid Injection (HFI) technique, in which heavy fractions separated from the produced oil are injected into the formations to reintroduce heat into the reservoir along with suspended nano-catalysts and hydrogen. In the presence of hydrogen, exothermic catalytic hydro-processing reactions occur that produce light gases and volatile hydrocarbons, which contribute to increased oil detachment from the rock, resulting in enhanced recovery. The process thus combines enhanced heavy oil recovery with upgrading, effectively handling the heat load within the reservoir, reducing hydrocarbon waste generation and minimizing the need for diluents. By eliminating most of the residual oil, the synthetic crude oil (SCO) is much easier to transport and more amenable to processing in refineries. For heavy oil reservoirs seriously impacted by the presence of aquifers, the nano-catalytic technology can still be implemented in the field, though with some additional investment and reduced synergies; it still serves the purpose of producing transportable oil, with substantial benefits over both large-scale upgrading and the commercial field upgrading technologies currently on the market. The paper delves deeper into the technology discussed and its future compatibility.
Keywords: upgrading, synthetic crude oil, nano-catalytic technology, compatibility
Procedia PDF Downloads 408
2801 Teaching English as a Foreign Language: Insights from the Philippine Context
Authors: Arlene Villarama, Micol Grace Guanzon, Zenaida Ramos
Abstract:
This paper provides insights into teaching English as a foreign language in the Philippines. The authors reviewed relevant theories and literature and analyze the issues in teaching English in the Philippine setting in the light of these theories. The authors conducted an investigation in Bagong Barrio National High School (BBNHS), a public school in Caloocan City with a population of nearly 3,000 students. The performance of 365 randomly chosen respondents was scrutinised. The study highlights the factors behind the success of teaching English as a foreign language to Filipino children, including the respondents’ family background, surroundings, way of living, and their behavior and understanding regarding education. The results show that there is a significant relationship between the demonstrative (emotional), communal (social), and logical (intellectual) areas that affect the efficacy of introducing English as a foreign language. Filipino children, by nature, are adventurous and naturally joyful even about little things. They are born with natural skills and capabilities to discover new things. They highly value activities and work that ignite their curiosity. They love to be recognised and are most inspired when given the assurance of acceptance and belongingness. Fun is the appealing influence that ignites and motivates learning; the magic word is excitement. The study reveals the many facets of the accumulation and transmission of knowledge: in the introduction and administration of English as a foreign language, learning runs and passes through different channels of diffusion, and along the way there are particles that act as obstructions in the protocols through which knowledge is to be gathered. Data gained from the respondents reveal a reality that is beyond one’s imagination. One significant factor behind the inefficacy of understanding and using English as a foreign language is an erroneous outlook gained from an old belief handed down from generation to generation. This accepted perception about the power and influence of the use of the language gives novices either a negative or a positive notion. The investigation shows that a higher number of dislikes of the use of English can be traced to the belief in the story of how the English language came into existence. The belief that only the great and the influential have the right to use English as a means of communication kills the joy of acceptance. These misconceptions have to be examined so as to provide a solution or, if not, to eradicate them. The result of the authors’ research depicts a substantial correlation between the emotional (demonstrative), social (communal), and intellectual (logical) areas. The focus of this paper is to bring out the right notions and disclose the misconceptions with regard to teaching English as a foreign language, concentrating on the emotional, social, and intellectual areas of Filipino learners and how these areas affect the transmittance and accumulation of learning. The authors’ aim is to formulate logical ways and techniques that would open up new beginnings in the understanding and acceptance of the subject matter.
Keywords: accumulation, behaviour, facets, misconceptions, transmittance
Procedia PDF Downloads 204
2800 Valorisation of Mango Seed: Response Surface Methodology Based Optimization of Starch Extraction from Mango Seeds
Authors: Tamrat Tesfaye, Bruce Sithole
Abstract:
Box-Behnken response surface methodology was used to determine the optimum processing conditions that give maximum extraction yield and whiteness index from mango seed. Steeping time ranged from 2 to 12 hours, and the steeped seeds were slurried in sodium metabisulphite solution (0.1 to 0.5% w/v). Experiments were designed according to a Box-Behnken design with three factors, and a total of 15 experimental runs were analyzed. At the linear level, the concentration of sodium metabisulphite had a significant positive influence on percentage yield and whiteness index at p<0.05. At the quadratic level, sodium metabisulphite concentration and its squared term had a significant negative influence on starch yield, while sodium metabisulphite concentration and the steeping time × temperature interaction had a significant (p<0.05) positive influence on whiteness index. Adjusted R2 values above 0.8 for starch yield (0.906465) and whiteness index (0.909268) showed a good fit of the model to the experimental data. The optimum sodium metabisulphite concentration, steeping time, and temperature for starch isolation with maximum starch yield (66.428%) and whiteness index (85%) as the optimization goals, with a desirability of 0.91939, were 0.255% w/v, 2 hrs and 50 °C, respectively. The experimental value of each response determined under the optimal conditions was statistically in accordance with the predicted levels at p<0.05. Mango seeds are by-products obtained during mango processing and pose a disposal problem if not handled properly. The substitution of food-based sizing agents with mango seed starch can contribute to value-added product manufacturing and waste utilization, which might play a significant role in food security in Ethiopia.
Keywords: mango, synthetic sizing agent, starch, extraction, textile, sizing
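The multi-response optimization above reports a combined desirability (0.91939); the standard desirability approach can be sketched as follows, a minimal illustration with hypothetical bounds rather than the study's fitted response models:

```python
def desirability_max(y, low, target):
    """Larger-is-better desirability: 0 at/below `low`, rising linearly to 1 at `target`."""
    if y <= low:
        return 0.0
    if y >= target:
        return 1.0
    return (y - low) / (target - low)

def overall_desirability(ds):
    """Combine individual desirabilities as their geometric mean, so any
    response with d = 0 zeroes the whole score."""
    prod = 1.0
    for d in ds:
        prod *= d
    return prod ** (1.0 / len(ds))
```

The optimizer searches the factor space (concentration, time, temperature) for the point maximizing this overall score over the fitted yield and whiteness models.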
Procedia PDF Downloads 231
2799 Boron Nitride Nanoparticle Enhanced Prepreg Composite Laminates
Authors: Qiong Tian, Lifeng Zhang, Demei Yu, Ajit D. Kelkar
Abstract:
Low specific weight and high strength are basic requirements for aerospace materials. Fiber-reinforced epoxy resin composites are attractive materials for this purpose. Boron nitride nanoparticles (BNNPs) have good radiation shielding capacity, which is very important for aerospace materials. Herein, a processing route for an advanced hybrid composite material is demonstrated by introducing dispersed BNNPs into standard prepreg manufacturing. The hybrid material contains three parts: E-glass fiber, an aerospace-grade epoxy resin system, and BNNPs. Vacuum assisted resin transfer molding (VARTM) was utilized in this processing. Two BNNP functionalization approaches are presented in this study: (a) covalent functionalization with 3-aminopropyltriethoxysilane (KH-550); (b) non-covalent functionalization with cetyltrimethylammonium bromide (CTAB). The functionalized BNNPs were characterized by Fourier-transform infrared spectroscopy (FT-IR), X-ray diffraction (XRD) and scanning electron microscopy (SEM). The results showed that BN powder was successfully functionalized via both the covalent and non-covalent approaches without any change in crystal structure, and large agglomerates were broken into platelet-like nanoparticles (BNNPs) after functionalization. Compared to pristine BN powder, surface-modified BNNPs produced significant improvements in mechanical properties such as tensile, flexural and compressive strength and modulus. CTAB-functionalized BNNPs (CTAB-BNNPs) showed higher tensile and flexural strength but lower compressive strength than KH-550-functionalized BNNPs (KH550-BNNPs). These reinforcements are mainly attributed to good BNNP dispersion and interfacial adhesion between the epoxy matrix and the BNNPs. This study reveals the potential for improving the mechanical properties of BNNP-containing composite laminates through surface functionalization of BNNPs.
Keywords: boron nitride, epoxy, functionalization, prepreg, composite
Procedia PDF Downloads 434
2798 Object Detection in Digital Images under Non-Standardized Conditions Using Illumination and Shadow Filtering
Authors: Waqqas-ur-Rehman Butt, Martin Servin, Marion Pause
Abstract:
In recent years, object detection has gained much attention and become a very active research area in the field of computer vision. Robust detection of object boundaries in an image is demanded in numerous applications of human-computer interaction and automated surveillance systems. Many methods and approaches have been developed for automatic object detection in various fields, such as automotive, quality control management and environmental services. Unfortunately, to the best of our knowledge, object detection under varying illumination with shadow consideration has not been well solved yet; this problem is also one of the major hurdles keeping object detection methods from practical application. This paper presents an approach to automatic object detection in images under non-standardized environmental conditions. A key challenge is how to detect the object, particularly under uneven illumination conditions. Because image capturing conditions vary, the algorithms need to consider a variety of possible environmental factors, as colour information, lighting and shadows vary from image to image. Existing methods mostly fail to produce appropriate results due to variation in colour information, lighting effects, threshold specifications, histogram dependencies and colour ranges. To overcome these limitations, we propose an object detection algorithm with pre-processing methods that reduce the interference caused by shadow and illumination effects without fixed parameters. We use the YCrCb colour model without any specific colour ranges or predefined threshold values. The segmented object regions are further classified using morphological operations (erosion and dilation) and contours. The proposed approach was applied to a large image data set acquired under various environmental conditions for wood stack detection. Experiments show the promising results of the proposed approach in comparison with existing methods.
Keywords: image processing, illumination equalization, shadow filtering, object detection
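The morphological operations mentioned above (erosion and dilation) can be sketched on a binary mask; a minimal pure-Python illustration with a square structuring element, not the authors' implementation (a production pipeline would use an image library's optimized routines):

```python
def erode(img, k=3):
    """Binary erosion with a k x k square structuring element.

    A pixel survives only if its entire neighbourhood is foreground;
    out-of-border positions count as background.
    """
    h, w, r = len(img), len(img[0]), k // 2
    out = [[0] * w for _ in range(h)]
    for i in range(h):
        for j in range(w):
            out[i][j] = 1 if all(
                0 <= i + di < h and 0 <= j + dj < w and img[i + di][j + dj]
                for di in range(-r, r + 1) for dj in range(-r, r + 1)
            ) else 0
    return out

def dilate(img, k=3):
    """Binary dilation with a k x k square structuring element:
    a pixel fires if any neighbour is foreground."""
    h, w, r = len(img), len(img[0]), k // 2
    out = [[0] * w for _ in range(h)]
    for i in range(h):
        for j in range(w):
            out[i][j] = 1 if any(
                0 <= i + di < h and 0 <= j + dj < w and img[i + di][j + dj]
                for di in range(-r, r + 1) for dj in range(-r, r + 1)
            ) else 0
    return out
```

Erosion followed by dilation (opening) removes speckle noise smaller than the structuring element while preserving larger segmented regions.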
Procedia PDF Downloads 216
2797 A Simple and Empirical Refraction Correction Method for UAV-Based Shallow-Water Photogrammetry
Authors: I GD Yudha Partama, A. Kanno, Y. Akamatsu, R. Inui, M. Goto, M. Sekine
Abstract:
Aerial photogrammetry of shallow water bottoms has the potential to be an efficient high-resolution survey technique for shallow water topography, thanks to the advent of convenient UAVs and automatic image processing techniques (Structure-from-Motion (SfM) and Multi-View Stereo (MVS)). However, it suffers from systematic overestimation of the bottom elevation due to light refraction at the air-water interface. In this study, we present an empirical method to correct for the effect of refraction after the usual SfM-MVS processing, using common software. The presented method utilizes the empirical relation between the measured true depth and the estimated apparent depth to generate a correction factor, which is then used to convert the apparent water depth into a refraction-corrected (real-scale) water depth. To examine its effectiveness, we applied the method to two river sites and compared the RMS errors in the corrected bottom elevations with those obtained by three existing methods. The results show that the presented method is more effective than two of the existing methods: the method that applies no correction and the method that uses the refractive index of water (1.34) as the correction factor. In comparison with the remaining existing method, which adds an offset term after calculating the correction factor, the presented method performs better in Site 2 and worse in Site 1. However, we found this linear regression method to be unstable when the training data used for calibration are limited. It also suffers from a large negative bias in the correction factor when the estimated apparent water depth is affected by noise, according to our numerical experiment. Overall, the accuracy of refraction correction depends on various factors, such as the location, image acquisition, and GPS measurement conditions; the most effective method can be selected by statistical model selection (e.g. leave-one-out cross validation).
Keywords: bottom elevation, MVS, river, SfM
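The empirical correction factor described above amounts to a regression of true depth on apparent depth; a minimal sketch assuming the no-offset (through-the-origin) least-squares variant, with synthetic calibration points rather than field data:

```python
def fit_correction_factor(apparent, true):
    """Least-squares slope through the origin for the model true = c * apparent.

    c = sum(a_i * t_i) / sum(a_i^2); physically, c is expected to sit near
    1/1.34 (the inverse refractive index of water).
    """
    num = sum(a * t for a, t in zip(apparent, true))
    den = sum(a * a for a in apparent)
    return num / den

def correct_depths(apparent, factor):
    """Convert apparent (SfM-MVS) depths to refraction-corrected depths."""
    return [a * factor for a in apparent]
```

With limited or noisy calibration points this fit becomes unstable, which is the instability the study reports for the offset variant.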
Procedia PDF Downloads 299
2796 A Crystallization Kinetic Model for Long Fiber-Based Composite with Thermoplastic Semicrystalline Polymer Matrix
Authors: Nicolas Bigot, M'hamed Boutaous, Nahiene Hamila, Shihe Xin
Abstract:
Composite materials with polymer matrices are widely used in most industrial areas, particularly aeronautics and automotive. Thanks to the development of high-performance thermoplastic semicrystalline polymer matrices, these materials exhibit increasingly efficient properties. The polymer matrix in composite materials can develop a specific crystalline structure characteristic of crystallization in a fibrous medium. In order to guarantee good mechanical behavior of structures and to optimize their performance, it is necessary to define realistic mechanical constitutive laws for such materials that consider their physical structure. The interaction between fibers and matrix is a key factor in the mechanical behavior of composite materials. The transcrystallization phenomenon, which develops in the matrix around the fibers, constitutes the interphase, which greatly affects and governs the nature of the fiber-matrix interaction. Hence, it becomes fundamental to quantify its impact on the thermo-mechanical behavior of composite materials in relation to processing conditions. In this work, we propose a numerical model coupling the thermal and crystallization kinetics in long fiber-based composite materials, considering both the spherulitic and transcrystalline types of induced structures. After validating the model against results from the literature, with good agreement, a parametric study was carried out on the effects of the thermal kinetics, the fiber volume fraction, the deformation, and the pressure on the crystallization rate in the material under processing conditions. The transcrystallinity ratio is highlighted and analyzed with regard to the thermal kinetics and gradients in the material. Experimental results on the process are foreseen and pave the way to establishing a mechanical constitutive law that introduces the role of crystallization rates and types in the thermo-mechanical behavior of composite materials.
Keywords: composite materials, crystallization, heat transfer, modeling, transcrystallization
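Isothermal polymer crystallization kinetics of this kind are commonly described by Avrami-type laws; the abstract does not name its kinetic model, so the following is an illustrative Avrami sketch, not the authors' coupled thermo-kinetic model:

```python
import math

def avrami_crystallinity(t, k, n):
    """Relative crystallinity under the Avrami model: X(t) = 1 - exp(-k * t**n).

    k lumps nucleation and growth rates; the exponent n reflects growth
    geometry (e.g. ~3 for spherulites, lower for constrained transcrystalline
    growth along fibers).
    """
    return 1.0 - math.exp(-k * t ** n)

def avrami_half_time(k, n):
    """Time at which X reaches 0.5: t_1/2 = (ln 2 / k) ** (1/n)."""
    return (math.log(2.0) / k) ** (1.0 / n)
```

In a coupled model, k and n become temperature-dependent and the kinetics are integrated alongside the heat equation, which is where the thermal gradients discussed above enter.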
Procedia PDF Downloads 193
2795 Development of a Software System for Management and Genetic Analysis of Biological Samples for Forensic Laboratories
Authors: Mariana Lima, Rodrigo Silva, Victor Stange, Teodiano Bastos
Abstract:
Due to the high reliability reached by DNA tests, since the 1980s this kind of test has allowed the resolution of a growing number of criminal cases, including old unsolved cases that now have a chance to be solved with this technology. Currently, the use of genetic profiling databases is a typical method to increase the scope of genetic comparison. Forensic laboratories must process, analyze, and generate genetic profiles from a growing number of samples, which requires time and great storage capacity. Therefore, it is essential to develop methodologies, using software tools, capable of organizing the workflow and minimizing the time spent on both biological sample processing and analysis of genetic profiles. Thus, the present work aims at the development of a software system for forensic genetics laboratories that provides sample, criminal case and local database management, minimizes the time spent in the workflow and helps to compare genetic profiles. For the development of this software system, all data related to the storage and processing of samples, the workflows, and the requirements that the system incorporates have been considered. The system uses HTML, CSS, and JavaScript on the web front end, with Node.js as the server platform, which is highly efficient for data input and output. In addition, the data are stored in a relational database (MySQL), which is free, easing acceptance by users. The software system developed here brings more agility to the workflow and analysis of samples, contributing to the rapid insertion of genetic profiles into the national database and to increasing the resolution of crimes. The next step of this research is its validation, in order to operate in accordance with current Brazilian national legislation.
Keywords: database, forensic genetics, genetic analysis, sample management, software solution
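Genetic profile comparison of the kind such a system supports can be sketched as a locus-by-locus allele match; a minimal illustration with hypothetical STR locus names and a hypothetical match threshold, not the actual system's matching logic:

```python
def match_profile(query, reference):
    """Count matching STR loci between two genetic profiles.

    Profiles are dicts mapping locus name -> allele pair, e.g.
    {"D8S1179": (12, 14)}. Allele order is ignored. Returns
    (matched, compared) over the loci typed in both profiles.
    """
    shared = set(query) & set(reference)
    matched = sum(
        1 for locus in shared
        if tuple(sorted(query[locus])) == tuple(sorted(reference[locus]))
    )
    return matched, len(shared)

def search_database(query, database, min_match=0.9):
    """Return sample ids whose profiles match the query at >= min_match
    of the loci shared with it."""
    hits = []
    for sample_id, profile in database.items():
        matched, compared = match_profile(query, profile)
        if compared and matched / compared >= min_match:
            hits.append(sample_id)
    return hits
```

A real system layers statistical weight (random-match probabilities) and legal thresholds on top of this raw locus count.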
Procedia PDF Downloads 370
2794 Cognitive Deficits and Association with Autism Spectrum Disorder and Attention Deficit Hyperactivity Disorder in 22q11.2 Deletion Syndrome
Authors: Sinead Morrison, Ann Swillen, Therese Van Amelsvoort, Samuel Chawner, Elfi Vergaelen, Michael Owen, Marianne Van Den Bree
Abstract:
22q11.2 Deletion Syndrome (22q11.2DS) is caused by the deletion of approximately 60 genes on chromosome 22 and is associated with high rates of neurodevelopmental disorders such as Attention Deficit Hyperactivity Disorder (ADHD) and Autism Spectrum Disorders (ASD). The presentation of these disorders in 22q11.2DS is reported to be comparable to idiopathic forms and therefore presents a valuable model for understanding mechanisms of neurodevelopmental disorders. Cognitive deficits are thought to be a core feature of neurodevelopmental disorders, and possibly manifest in behavioural and emotional problems. There have been mixed findings in 22q11.2DS on whether the presence of ADHD or ASD is associated with greater cognitive deficits. Furthermore, the influence of developmental stage has never been taken into account. The aim was therefore to examine whether the presence of ADHD or ASD was associated with cognitive deficits in childhood and/or adolescence in 22q11.2DS. We conducted the largest study to date of this kind in 22q11.2DS. The same battery of tasks measuring processing speed, attention and spatial working memory were completed by 135 participants with 22q11.2DS. Wechsler IQ tests were completed, yielding Full Scale (FSIQ), Verbal (VIQ) and Performance IQ (PIQ). Age-standardised difference scores were produced for each participant. Developmental stages were defined as children (6-10 years) and adolescents (10-18 years). ADHD diagnosis was ascertained from a semi-structured interview with a parent. ASD status was ascertained from a questionnaire completed by a parent. Interaction and main effects of cognitive performance of those with or without a diagnosis of ADHD or ASD in childhood or adolescence were conducted with 2x2 ANOVA. Significant interactions were followed up with t-tests of simple effects. 
Adolescents with ASD displayed greater deficits on all measures (processing speed, p = 0.022; sustained attention, p = 0.016; working memory, p = 0.006) than adolescents without ASD; there was no difference between children with and without ASD. There were no significant differences on IQ measures. Both children and adolescents with ADHD displayed greater deficits on sustained attention (p = 0.002) than those without ADHD; there were no significant differences on any other measures for ADHD. The magnitude of cognitive deficit in individuals with 22q11.2DS varied by cognitive domain, developmental stage and presence of a neurodevelopmental disorder. That adolescents with 22q11.2DS and ASD showed greater deficits on all measures suggests either a sensitive period in childhood for acquiring these domains or an effect of increasing social and academic demands in adolescence. The finding of poorer sustained attention in children and adolescents with ADHD supports previous research and suggests a specific deficit that can be separated from processing speed and working memory. This research provides unique insights into the association of ASD and ADHD with cognitive deficits in a group at high genomic risk of neurodevelopmental disorders.
Keywords: 22q11.2 deletion syndrome, attention deficit hyperactivity disorder, autism spectrum disorder, cognitive development
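The 2x2 factorial design described above (ASD status by developmental stage, with age-standardised scores) can be sketched as a balanced two-way ANOVA computed directly with NumPy. The cell means and sample sizes below are simulated and purely illustrative, not the study's data:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated age-standardised difference scores (illustrative only):
# factor A = ASD status, factor B = developmental stage, n per cell.
n = 20
cells = {
    ("no_ASD", "child"): rng.normal(0.0, 1.0, n),
    ("no_ASD", "adolescent"): rng.normal(0.0, 1.0, n),
    ("ASD", "child"): rng.normal(-0.1, 1.0, n),
    ("ASD", "adolescent"): rng.normal(-0.8, 1.0, n),  # deficit mainly here
}

def anova_2x2(cells, n):
    """Balanced two-way ANOVA: sums of squares and F ratios for A, B, A x B."""
    data = np.array(list(cells.values()))              # shape (4, n)
    grand = data.mean()
    cell_means = data.mean(axis=1).reshape(2, 2)       # rows = A, cols = B
    a_means, b_means = cell_means.mean(axis=1), cell_means.mean(axis=0)
    ss = {
        "A": 2 * n * ((a_means - grand) ** 2).sum(),
        "B": 2 * n * ((b_means - grand) ** 2).sum(),
        "AxB": n * ((cell_means - a_means[:, None]
                     - b_means[None, :] + grand) ** 2).sum(),
        "within": ((data - data.mean(axis=1, keepdims=True)) ** 2).sum(),
    }
    ms_within = ss["within"] / (4 * (n - 1))           # error df = 4 * (n - 1)
    f_stats = {k: ss[k] / ms_within for k in ("A", "B", "AxB")}  # each effect df = 1
    return f_stats, ss

f_stats, ss = anova_2x2(cells, n)
```

A significant AxB term here plays the role of the interaction the authors follow up with simple-effects t-tests.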
2793 Integrated Life Skill Training and Executive Function Strategies in Children with Autism Spectrum Disorder in Qatar: A Study Protocol for a Randomized Controlled Trial
Authors: Bara M Yousef, Naresh B Raj, Nadiah W Arfah, Brightlin N Dhas
Abstract:
Background: Executive function (EF) impairment is common in children with autism spectrum disorder (ASD), and EF strategies are considered effective in improving the therapeutic outcomes of children with ASD. Aims: This study primarily aims to explore whether integrating EF strategies with regular occupational therapy intervention is more effective in improving daily life skills (DLS) and sensory integration/processing (SI/SP) skills than regular occupational therapy alone in children with ASD; it secondarily aims to assess treatment outcomes for visual motor integration (VMI) skills. Procedures: A total of 92 children with ASD will be recruited and, following baseline assessments, randomly assigned to the treatment group (45-min once-weekly individual occupational therapy plus EF strategies) or the control group (45-min once-weekly individual therapy sessions alone). Results and Outcomes: All children will be evaluated systematically by assessing SI/SP, DLS, and VMI skills at baseline and after 7 and 14 weeks of treatment. Data will be analyzed using ANCOVA and t-tests. Conclusions and Implications: This single-blind, randomized controlled trial will provide empirical evidence for the effectiveness of EF strategies combined with regular occupational therapy programs. Based on trial results, EF strategies could be recommended in multidisciplinary programs for children with ASD. Trial Registration: The trial has been registered in the ClinicalTrials.gov registry, protocol ID: MRC-01-22-509, ClinicalTrials.gov Identifier: NCT05829577, registered 25th April 2023.
Keywords: autism spectrum disorder, executive function strategies, daily life skills, sensory integration/processing, visual motor integration, occupational therapy, effectiveness
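The planned ANCOVA can be sketched as a model comparison: regress the post-treatment score on the baseline covariate with and without the group indicator, and form an F ratio from the reduction in residual sum of squares. Everything below is simulated for illustration (hypothetical effect sizes, not trial data):

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative ANCOVA sketch: post-treatment DLS score modelled from the
# baseline score plus group (therapy + EF strategies vs therapy alone).
n = 46                                    # per arm, matching the 92-child target
baseline = rng.normal(50, 10, 2 * n)
group = np.repeat([0, 1], n)              # 0 = control, 1 = treatment
post = 5 + 0.8 * baseline + 4.0 * group + rng.normal(0, 5, 2 * n)

def ancova_f(post, baseline, group):
    """F test for the group effect after adjusting for the baseline covariate."""
    def rss(X):
        beta, *_ = np.linalg.lstsq(X, post, rcond=None)
        resid = post - X @ beta
        return (resid ** 2).sum()
    ones = np.ones_like(post)
    rss_full = rss(np.column_stack([ones, baseline, group]))
    rss_reduced = rss(np.column_stack([ones, baseline]))  # group dropped
    df_resid = len(post) - 3                              # intercept + 2 slopes
    return (rss_reduced - rss_full) / (rss_full / df_resid)

f_group = ancova_f(post, baseline, group)
```

The adjustment for baseline is what distinguishes this from a plain between-group t-test on the 14-week scores.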
2792 Entrepreneurial Orientation and Business Performance: The Case of Micro Scale Food Processors Operating in a War-Recovery Environment
Authors: V. Suganya, V. Balasuriya
Abstract:
The functioning of Micro and Small Scale (MSS) businesses in the northern part of Sri Lanka was vulnerable during three decades of internal conflict, and the subsequent post-war economic opening has created new market prospects for MSS businesses. MSS businesses survive and operate with limited resources and struggle to access finance, raw materials, markets, and technology. This study attempts to identify the manner in which entrepreneurial orientation is put into practice by business operators to overcome these business challenges. Business operators in the traditional food processing sector were taken for this study, as this sub-sector of the food industry is developing at a rapid pace. A review of the literature was done to recognize the concept of entrepreneurial orientation, to define MSS businesses, and to establish the manner in which business performance is measured. A direct interview method supported by a structured questionnaire was used to collect data from 80 respondents, selected by a fixed-interval random sampling technique. This study reveals that more than half of the business operators opted to commence their business ventures as a result of identifying a market opportunity. 41 per cent of the business operators are highly entrepreneurially oriented on a scale of 1 to 5. Entrepreneurial orientation shows a significant relationship with, and is strongly correlated with, business performance. Pro-activeness, innovativeness and competitive aggressiveness show significant relationships with business performance, while risk taking is negatively related and autonomy is not significantly related to business performance. It is evident that entrepreneurially oriented business practices contribute to better business performance, even though 70 per cent of operators prefer the ideas and views of support agencies over those of other stakeholders when making business decisions.
It is recommended that appropriate training be introduced to develop entrepreneurial skills, with a focus on improving business networks so that new business opportunities and innovative business practices can be identified.
Keywords: Micro and Small Scale (MSS) businesses, entrepreneurial orientation (EO), food processing, business operators
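The dimension-by-dimension correlation analysis reported above can be sketched as follows. The Likert responses, the performance score, and the cut-off for "highly oriented" are all simulated and hypothetical, chosen only to mirror the reported pattern (three dimensions loading positively):

```python
import numpy as np

rng = np.random.default_rng(2)

# Simulated 1-5 Likert ratings of five EO dimensions for 80 operators
# (illustrative only, not the survey data).
n = 80
eo = {dim: rng.integers(1, 6, n).astype(float)
      for dim in ["proactiveness", "innovativeness", "aggressiveness",
                  "risk_taking", "autonomy"]}
# Performance loads on the first three dimensions, echoing the reported result.
performance = (eo["proactiveness"] + eo["innovativeness"]
               + eo["aggressiveness"] + rng.normal(0, 2, n))

# Pearson correlation of each EO dimension with business performance.
correlations = {dim: np.corrcoef(scores, performance)[0, 1]
                for dim, scores in eo.items()}

# Share of operators whose mean EO score crosses a (hypothetical) threshold of 4.
eo_mean = np.column_stack(list(eo.values())).mean(axis=1)
highly_oriented = float((eo_mean >= 4).mean())
```

In the study itself the correlations would be computed on the composite EO score as well as on each sub-dimension, with significance tests attached.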
2791 VIAN-DH: Computational Multimodal Conversation Analysis Software and Infrastructure
Authors: Teodora Vukovic, Christoph Hottiger, Noah Bubenhofer
Abstract:
The development of VIAN-DH aims at bridging two linguistic approaches: conversation analysis/interactional linguistics (IL), so far a dominantly qualitative field, and computational/corpus linguistics with its quantitative and automated methods. Contemporary IL investigates the systematic organization of conversations and interactions composed of speech, gaze, gestures, and body positioning, among others. This highly integrated multimodal behaviour is analysed on the basis of video data, with the aim of uncovering so-called "multimodal gestalts": patterns of linguistic and embodied conduct that recur in specific sequential positions and are employed for specific purposes. Multimodal analysis (like other disciplines that use video) has so far depended on time- and resource-intensive manual transcription of each component of the video material. Automating these tasks requires advanced programming skills, which are often beyond the scope of IL. Moreover, the use of different tools makes the integration and analysis of different formats challenging. Consequently, IL research often deals with relatively small samples of annotated data, suitable for qualitative analysis but not sufficient for making generalized empirical claims derived quantitatively. VIAN-DH aims to create a workspace where the many annotation layers required for the multimodal analysis of videos can be created, processed, and correlated in one platform. VIAN-DH will provide a graphical interface that operates state-of-the-art tools for automating parts of the data processing. The integration of tools that already exist in computational linguistics and computer vision facilitates data processing for researchers lacking programming skills, speeds up the overall research process, and enables the processing of large amounts of data.
The main features to be introduced are automatic speech recognition for the transcription of language, automatic image recognition for the extraction of gestures and other visual cues, and grammatical annotation for adding morphological and syntactic information to the verbal content. In the current instance of VIAN-DH, we focus on gesture extraction (pointing gestures in particular), making use of existing models created for sign language and adapting them for this specific purpose. In order to view and search the data, VIAN-DH will provide a unified format, enable the import of the main existing formats of annotated video data and export to other formats used in the field, and integrate different source formats so that they can be combined in research. VIAN-DH will adapt querying methods from corpus linguistics to enable parallel search across many annotation levels, combining token-level and chronological search for various types of data. VIAN-DH strives to bring crucial and potentially revolutionary innovation to the field of IL (one that can also extend to other fields using video materials). It will allow large amounts of data to be processed automatically and quantitative analyses to be implemented, combining them with the qualitative approach. It will facilitate the investigation of correlations between linguistic patterns (lexical or grammatical) and conversational aspects (turn-taking or gestures). Users will be able to automatically transcribe and annotate visual, spoken and grammatical information from videos, correlate those different levels, and perform queries and analyses.
Keywords: multimodal analysis, corpus linguistics, computational linguistics, image recognition, speech recognition
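The core querying idea, combining token-level and chronological search across annotation layers, can be sketched with a minimal interval model. The data model below is hypothetical (it is not VIAN-DH's actual format): annotations are time-stamped spans on named layers, and a cross-layer query pairs speech tokens with gestures whose time spans intersect them:

```python
from dataclasses import dataclass

@dataclass
class Annotation:
    start: float   # seconds from the start of the video
    end: float
    layer: str     # e.g. "token" or "gesture"
    label: str

def overlapping(tokens, gestures):
    """Pair each token with every gesture whose time span intersects it."""
    return [(t.label, g.label)
            for t in tokens for g in gestures
            if t.start < g.end and g.start < t.end]

# Toy transcript: a pointing gesture co-occurring with part of an utterance.
tokens = [Annotation(0.0, 0.4, "token", "look"),
          Annotation(0.4, 0.7, "token", "at"),
          Annotation(0.7, 1.1, "token", "that")]
gestures = [Annotation(0.6, 1.3, "gesture", "pointing")]

pairs = overlapping(tokens, gestures)
# -> [("at", "pointing"), ("that", "pointing")]
```

Counting such pairs over a large annotated corpus is one way the correlations between lexical patterns and gestures mentioned above could be quantified.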