Search results for: automatic processing
3705 Effect of Plasma Treatment on UV Protection Properties of Fabrics
Authors: Sheila Shahidi
Abstract:
UV protection by fabrics has recently become a focus of great interest, particularly in connection with environmental degradation or ozone layer depletion. Fabrics provide simple and convenient protection against UV radiation (UVR), but not all fabrics offer sufficient UV protection. To describe the degree of UVR protection offered by clothing materials, the ultraviolet protection factor (UPF) is commonly used. UV-protective fabric can be generated by application of a chemical finish using normal wet-processing methodologies. However, traditional wet-processing techniques are known to consume large quantities of water and energy and may lead to adverse alterations of the bulk properties of the substrate. Recently, the use of plasmas to generate physicochemical surface modifications of textile substrates has become an intriguing approach to replace or enhance conventional wet-processing techniques. In this research work, the effect of plasma treatment on the UV protection properties of fabrics was investigated. DC magnetron sputtering was used, and plasma parameters such as gas type, electrodes, time of exposure and power were studied. The morphological and chemical properties of samples were analyzed using Scanning Electron Microscopy (SEM) and Fourier Transform Infrared Spectroscopy (FTIR), respectively. The transmittance and UPF values of the original and plasma-treated samples were measured using a Shimadzu UV3101 PC (UV–Vis–NIR scanning spectrophotometer, 190–2,100 nm range). It was concluded that plasma treatment, an eco-friendly, cost-effective and dry technique already used in different branches of industry, will gain ground in the textile industry in the near future and is a promising method for the preparation of UV-protective textiles.
Keywords: fabric, plasma, textile, UV protection
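The UPF reported above is computed from the measured spectral transmittance. A minimal sketch of the standard spectral weighting (as in AS/NZS 4399; the wavelength grid, irradiance and transmittance values below are illustrative stand-ins, not the paper's measurements):

```python
# UPF = sum(E*S*dl) / sum(E*S*T*dl), where E is the solar spectral
# irradiance, S the erythemal action spectrum and T the fabric
# transmittance over the UV band. All numbers here are illustrative.

def upf(wavelengths, irradiance, erythemal, transmittance):
    """Ultraviolet protection factor from spectral transmittance."""
    step = wavelengths[1] - wavelengths[0]  # uniform 5 nm grid assumed
    num = sum(e * s * step for e, s in zip(irradiance, erythemal))
    den = sum(e * s * t * step
              for e, s, t in zip(irradiance, erythemal, transmittance))
    return num / den

wl = [290, 295, 300, 305, 310]        # nm (illustrative subset)
E = [0.01, 0.05, 0.20, 0.60, 1.00]    # relative solar irradiance
S = [1.00, 0.80, 0.40, 0.10, 0.02]    # erythemal weighting
T_untreated = [0.20] * 5              # 20% transmittance at all wavelengths
T_plasma = [0.05] * 5                 # 5% after a hypothetical treatment

print(round(upf(wl, E, S, T_untreated), 1))  # uniform 20% -> UPF 5.0
print(round(upf(wl, E, S, T_plasma), 1))     # uniform 5%  -> UPF 20.0
```

With spectrally flat transmittance the weighting cancels and UPF reduces to 1/T, which makes the sketch easy to check by hand.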
Procedia PDF Downloads 519
3704 High Motivational Salient Face Distractors Slowed Target Detection: Evidence from Behavioral Studies
Authors: Rashmi Gupta
Abstract:
Rewarding stimuli capture attention involuntarily as a result of an association process that develops quickly during value learning, referred to as reward- or value-driven attentional capture. It is essential to compare reward with punishment processing to get a full picture of value-based modulation in visual attention processing. Hence, the present study manipulated both valence/value (reward as well as punishment) and motivational salience (probability of an outcome: high vs. low) together. A series of experiments was conducted, with two phases in each experiment. In phase 1, participants were required to learn to associate specific face stimuli with a high or low probability of winning or losing points. In the second phase, these conditioned stimuli then served as a distractor or prime in a speeded letter search task. Faces with high versus low outcome probability, regardless of valence, slowed the search for targets (specifically targets in the left visual field), suggesting that the costs to performance on non-emotional cognitive tasks were driven only by the motivational salience (high vs. low) associated with the stimuli rather than by their valence (gain vs. loss). It also suggests that the processing of motivationally salient stimuli is right-hemisphere biased. Together, the results of these studies strengthen the notion that our visual attention system is more sensitive to motivational salience than to valence, termed here motivation-driven attentional capture.
Keywords: attention, distractors, motivational salience, valence
Procedia PDF Downloads 220
3703 Optimizing Data Transfer and Processing in Multi-Cloud Environments for Big Data Workloads
Authors: Gaurav Kumar Sinha
Abstract:
In an era defined by the proliferation of data and the utilization of cloud computing environments, the efficient transfer and processing of big data workloads across multi-cloud platforms have emerged as critical challenges. This research paper embarks on a comprehensive exploration of the complexities associated with managing and optimizing big data in a multi-cloud ecosystem. The foundation of this study is rooted in the recognition that modern enterprises increasingly rely on multiple cloud providers to meet diverse business needs, enhance redundancy, and reduce vendor lock-in. As a consequence, managing data across these heterogeneous cloud environments has become intricate, necessitating innovative approaches to ensure data integrity, security, and performance. The primary objective of this research is to investigate strategies and techniques for enhancing the efficiency of data transfer and processing in multi-cloud scenarios. It recognizes that big data workloads are characterized by their sheer volume, variety, velocity, and complexity, making traditional data management solutions insufficient for harnessing the full potential of multi-cloud architectures. The study commences by elucidating the challenges posed by multi-cloud environments in the context of big data. These challenges encompass data fragmentation, latency, security concerns, and cost optimization. To address these challenges, the research explores a range of methodologies and solutions. One of the key areas of focus is data transfer optimization. The paper delves into techniques for minimizing data movement latency, optimizing bandwidth utilization, and ensuring secure data transmission between different cloud providers. It evaluates the applicability of dedicated data transfer protocols, intelligent data routing algorithms, and edge computing approaches in reducing transfer times. Furthermore, the study examines strategies for efficient data processing across multi-cloud environments.
It acknowledges that big data processing requires distributed and parallel computing capabilities that span across cloud boundaries. The research investigates containerization and orchestration technologies, serverless computing models, and interoperability standards that facilitate seamless data processing workflows. Security and data governance are paramount concerns in multi-cloud environments. The paper explores methods for ensuring data security, access control, and compliance with regulatory frameworks. It considers encryption techniques, identity and access management, and auditing mechanisms as essential components of a robust multi-cloud data security strategy. The research also evaluates cost optimization strategies, recognizing that the dynamic nature of multi-cloud pricing models can impact the overall cost of data transfer and processing. It examines approaches for workload placement, resource allocation, and predictive cost modeling to minimize operational expenses while maximizing performance. Moreover, this study provides insights into real-world case studies and best practices adopted by organizations that have successfully navigated the challenges of multi-cloud big data management. It presents a comparative analysis of various multi-cloud management platforms and tools available in the market.
Keywords: multi-cloud environments, big data workloads, data transfer optimization, data processing strategies
Procedia PDF Downloads 67
3702 Study on Roll Marks of Stainless Steel in Rolling Mill
Authors: Cai-Wan Chang-Jian, Han-Ting Tsai
Abstract:
In the metal-forming industry, rolling is the most widely used processing method. In a cold rolling factory for stainless steel, a product defect occurs during the temper rolling process within cold rolling. It is called 'roll marks', a phenomenon causing an undesirable flatness problem. In this research, we performed a series of experimental measurements on the roll marks using optical sensors and compared the vibration frequency of the roll marks with the vibration frequencies of key components in the skin pass mill. We found little correlation between these data. Finally, we took measurements on the motor drive in the rolling mill. We found that the undulation frequency of the motor matched the frequency of the roll marks, confirming that the motor's undulation caused the roll marks.
Keywords: roll mark, plane strain, rolling mill, stainless steel
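The frequency-matching step described above can be sketched numerically: extract the dominant frequency of a measured mark-spacing signal and check it against a candidate source such as the motor undulation. A minimal pure-Python illustration (the sampling rate, motor frequency and signal are invented for the sketch, not measured data):

```python
import math
import cmath

def dominant_frequency(samples, sample_rate):
    """Frequency (Hz) of the largest non-DC bin of a naive O(n^2) DFT."""
    n = len(samples)
    best_mag, best_k = -1.0, 0
    for k in range(1, n // 2):  # skip DC; upper half mirrors the lower
        coeff = sum(samples[i] * cmath.exp(-2j * math.pi * k * i / n)
                    for i in range(n))
        if abs(coeff) > best_mag:
            best_mag, best_k = abs(coeff), k
    return best_k * sample_rate / n

rate = 100.0      # samples per second along the strip (assumed)
motor_hz = 5.0    # assumed motor undulation frequency
profile = [math.cos(2 * math.pi * motor_hz * i / rate) for i in range(200)]

print(dominant_frequency(profile, rate))  # 5.0 -> matches the motor
```

In practice an FFT would replace the naive DFT, but the matching logic (compare dominant mark frequency with each component frequency) is the same.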
Procedia PDF Downloads 454
3701 Embedded Electrochemistry with a Miniaturized, Drone-Based Potentiostat System for Remote Detection of Chemical Warfare Agents
Authors: Amer Dawoud, Jesy Motchaalangaram, Arati Biswakarma, Wujan Mio, Karl Wallace
Abstract:
The development of an embedded miniaturized drone-based system for remote detection of Chemical Warfare Agents (CWA) is proposed. The paper focuses on the software/hardware system design of the electrochemical Cyclic Voltammetry (CV) and Differential Pulse Voltammetry (DPV) signal processing for future deployment on drones. The paper summarizes the progress made towards hardware and electrochemical signal processing for signature detection of CWA. Also, the miniature potentiostat signal is validated by comparing it with the high-end lab potentiostat signal.
Keywords: drone-based, remote detection chemical warfare agents, miniaturized, potentiostat
Procedia PDF Downloads 136
3700 Assessment of an ICA-Based Method for Detecting the Effect of Attention in the Auditory Late Response
Authors: Siavash Mirahmadizoghi, Steven Bell, David Simpson
Abstract:
In this work, a new independent component analysis (ICA) based method for noise reduction in evoked potentials is evaluated on auditory late responses (ALR) captured with a 63-channel electroencephalogram (EEG) from 10 normal-hearing subjects. The performance of the new method is compared with a single-channel alternative in terms of signal-to-noise ratio (SNR), the number of channels with an SNR above an empirically derived statistical critical value, and an estimate of the effect of attention on the major components in the ALR waveform. The results show that the multichannel signal processing method can significantly enhance the quality of the ALR signal and also detect the effect of attention on the ALR better than the single-channel alternative.
Keywords: auditory late response (ALR), attention, EEG, independent component analysis (ICA), multichannel signal processing
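One of the comparison metrics above, per-channel SNR, can be estimated directly from the recorded epochs. A small sketch under invented assumptions (a sinusoidal stand-in for the evoked response plus Gaussian noise; the "plus-minus" alternating-sign average is one common noise estimator, not necessarily the authors'):

```python
import math
import random

def channel_snr(epochs):
    """Signal power of the epoch average over residual noise power."""
    n, length = len(epochs), len(epochs[0])
    avg = [sum(e[t] for e in epochs) / n for t in range(length)]
    # The alternating-sign ("plus-minus") average cancels the
    # stimulus-locked response, estimating the noise that survives
    # ordinary averaging.
    pm = [sum((-1) ** i * e[t] for i, e in enumerate(epochs)) / n
          for t in range(length)]
    sig = sum(a * a for a in avg) / length
    noise = sum(p * p for p in pm) / length
    return sig / noise

random.seed(0)
response = [math.sin(2 * math.pi * t / 50) for t in range(100)]

def make_epochs(noise_sd, n_epochs=40):
    """Synthetic single-channel epochs: evoked response + Gaussian noise."""
    return [[r + random.gauss(0.0, noise_sd) for r in response]
            for _ in range(n_epochs)]

print(channel_snr(make_epochs(0.5)) > channel_snr(make_epochs(2.0)))  # True
```

Counting the channels whose SNR exceeds a critical value, as in the paper, is then a simple threshold over these per-channel estimates.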
Procedia PDF Downloads 505
3699 Adaptive Data Approximations Codec (ADAC) for AI/ML-based Cyber-Physical Systems
Authors: Yong-Kyu Jung
Abstract:
The fast growth in information technology has led to demands to access and process data. Cyber-physical systems (CPSs) depend heavily on the timing of hardware/software operations and communication over the network, i.e., real-time/parallel operations in CPSs (e.g., autonomous vehicles). Data processing is an important means of overcoming the issues confronting data management, reducing the gap between technological growth on one side and data complexity and channel bandwidth on the other. An adaptive perpetual data approximation method is introduced to manage the actual entropy of the digital spectrum. An ADAC, implemented as an accelerator and/or as apps for servers and smart connected devices, adaptively rescales digital content (by 62.8% on average) and reduces data processing/access time and energy as well as encryption/decryption overheads in AI/ML applications (e.g., facial ID/recognition).
Keywords: adaptive codec, AI, ML, HPC, cyber-physical, cybersecurity
Procedia PDF Downloads 78
3698 Ecological Risk Assessment of Informal E-Waste Processing in Alaba International Market, Lagos, Nigeria
Authors: A. A. Adebayo, O. Osibanjo
Abstract:
Informal electronic waste (e-waste) processing is a crude method of recycling, which is on the increase in Nigeria. The release of hazardous substances such as heavy metals (HMs) into the environment during informal e-waste processing has been a major concern. However, there is insufficient information on environmental contamination from e-waste recycling and the associated ecological risk in Alaba International Market, a major electronics market in Lagos, Nigeria. The aims of this study were to determine the levels of HMs in soil resulting from e-waste recycling and to assess the associated ecological risks in Alaba International Market. Soil samples (334) were randomly collected seasonally for three years from fourteen selected e-waste activity points and two control sites. The samples were digested using standard methods and the HMs analysed by inductively coupled plasma optical emission spectrometry. Ecological risk was estimated using the Ecological Risk index (ER), Potential Ecological Risk index (RI), Index of geoaccumulation (Igeo), Contamination factor (Cf) and Degree of contamination (Cdeg). The concentration ranges of HMs (mg/kg) in soil were: 16.7-11200.0 (Pb); 14.3-22600.0 (Cu); 1.90-6280.0 (Ni); 39.5-4570.0 (Zn); 0.79-12300.0 (Sn); 0.02-138.0 (Cd); 12.7-1710.0 (Ba); 0.18-131.0 (Cr); 0.07-28.0 (V), while As was below detection limit. Concentration ranges in control soils were 1.36-9.70 (Pb), 2.06-7.60 (Cu), 1.25-5.11 (Ni), 3.62-15.9 (Zn), BDL-0.56 (Sn), BDL-0.01 (Cd), 14.6-47.6 (Ba), 0.21-12.2 (Cr) and 0.22-22.2 (V). The trend in the ecological risk index was in the order Cu > Pb > Ni > Zn > Cr > Cd > Ba > V. The potential ecological risk indices with respect to informal e-waste activities were: burning > dismantling > disposal > stockpiling. The geoaccumulation indices revealed that the soils were extremely polluted with Cd, Cu, Pb, Zn and Ni.
The contamination factor indicated that 93% of the studied areas have very high contamination status for Pb, Cu, Ba, Sn and Co, while Cr and Cd were in the moderately contaminated status. The degree of contamination decreased in the order Sn > Cu > Pb >> Zn > Ba > Co > Ni > V > Cr > Cd. Heavy metal contamination of the Alaba International Market environment resulting from informal e-waste processing was established. Proper management of e-waste and remediation of the market environment are recommended to minimize the ecological risks.
Keywords: Alaba international market, ecological risk, electronic waste, heavy metal contamination
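The pollution indices named above have standard closed forms (Muller's Igeo; Hakanson's Cf and ER). A minimal sketch with one illustrative metal, using values picked from the reported ranges; the toxic-response factor Tr = 5 for Pb is the conventional Hakanson value, not a figure from this paper:

```python
import math

def contamination_factor(cn, bn):
    """Cf = measured concentration over background concentration."""
    return cn / bn

def igeo(cn, bn):
    """Muller's geoaccumulation index; 1.5 absorbs background variation."""
    return math.log2(cn / (1.5 * bn))

def ecological_risk(tr, cn, bn):
    """Hakanson's ER = toxic-response factor times Cf."""
    return tr * contamination_factor(cn, bn)

pb_site, pb_bg, pb_tr = 11200.0, 9.70, 5   # Pb, mg/kg (illustrative picks)
print(round(contamination_factor(pb_site, pb_bg), 1))  # Cf >= 6: "very high"
print(round(igeo(pb_site, pb_bg), 2))       # Igeo > 5: "extremely polluted"
print(round(ecological_risk(pb_tr, pb_site, pb_bg), 1))
```

Summing Cf over all metals gives Cdeg, and summing ER gives the site's RI, which is how the burning > dismantling > disposal > stockpiling ordering above would be produced.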
Procedia PDF Downloads 198
3697 Optimization of the Feedstock Supply of an Oilseeds Conversion Unit for Biofuel Production in West Africa: A Comparative Study of the Supply of Jatropha curcas and Balanites aegyptiaca Seeds
Authors: Linda D. F. Bambara, Marie Sawadogo
Abstract:
Jatropha curcas (jatropha) is the plant that has been the most studied for biofuel production in West Africa. There exist, however, other plants, such as Balanites aegyptiaca (balanites), that have been targeted as a potential feedstock for biofuel production. This biomass could be an alternative feedstock for the production of straight vegetable oil (SVO) at costs lower than jatropha-based SVO production costs. This study aims firstly to determine, through an MILP model, the optimal organization that minimizes the costs of the oilseed supply of two biomass conversion units (BCU) exploiting, respectively, jatropha seeds and balanites seeds. Secondly, the study aims to carry out a comparative study of the costs obtained for each BCU. The model was implemented on two theoretical case studies built on the basis of common practices in Burkina Faso, and two scenarios were carried out for each case study. In scenario 1, three pre-processing locations ("at the harvesting area", "at the gathering points", "at the BCU") are possible. In scenario 2, only one location ("at the BCU") is possible. For each biomass, the system studied is the upstream supply chain (harvesting, transport and pre-processing (drying, dehulling, depulping)), including cultivation (for jatropha). The model optimizes the area of land to be exploited based on the productivity of the studied plants and the material losses that may occur during the harvesting and the supply of the BCU. It then defines the configuration of the logistics network allowing an optimal supply of the BCU, taking into account the most common means of transport in West African rural areas. For the two scenarios, the results of the implementation showed that the total area exploited for balanites (1807 ha) is 4.7 times greater than the total area exploited for jatropha (381 ha). In both case studies, the pre-processing location "at the harvesting area" was always chosen for scenario 1.
As the balanites trees were not planted and because the first harvest of the jatropha seeds took place 4 years after planting, the cost price of the jatropha seeds at the BCU without the pre-processing costs was about 430 XOF/kg. This cost is 3 times higher than that of balanites, which is 140 XOF/kg. After the first year of harvest, i.e. 5 years after planting, and assuming that the yield remains constant, the same cost price is about 200 XOF/kg for jatropha. This cost is still 1.4 times greater than that of balanites. The transport cost of the balanites seeds is about 120 XOF/kg, similar to that of the jatropha seeds. However, when the pre-processing is located at the BCU, i.e. for scenario 2, the transport cost of the balanites seeds is 1200 XOF/kg. This cost is 6 times greater than the transport cost of jatropha, which is 200 XOF/kg. These results show that the cost price of the balanites seeds at the BCU can be competitive compared to that of jatropha if the pre-processing is located at the harvesting area.
Keywords: Balanites aegyptiaca, biomass conversion, Jatropha curcas, optimization, post-harvest operations
Procedia PDF Downloads 338
3696 Tank Barrel Surface Damage Detection Algorithm
Authors: Tomáš Dyk, Stanislav Procházka, Martin Drahanský
Abstract:
The article proposes a new algorithm for detecting damaged areas of a tank barrel based on the image of the inner surface of the barrel. The damage position is calculated using image processing techniques such as edge detection, discrete wavelet transformation and image segmentation for accurate contour detection. The algorithm can detect surface damage in smoothbore and even in rifled tank barrels. The algorithm also calculates the volume of the detected damage from the depth map generated, for example, by the distance measurement unit. The proposed method was tested on data obtained by a tank barrel scanning device, which generates both surface image data and a depth map. The article also discusses tank barrel scanning devices and how a damaged surface impacts material resistance.
Keywords: barrel, barrel diagnostic, image processing, surface damage detection, tank
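The damage-volume step lends itself to a short sketch: once segmentation yields a damage mask, volume is approximated by integrating the depth map over the masked pixels. The pixel size, depth values and mask below are illustrative assumptions, not the paper's data:

```python
def damage_volume(depth_map, mask, pixel_area_mm2):
    """Approximate volume (mm^3) as sum of depth * pixel area over mask."""
    total = 0.0
    for row_d, row_m in zip(depth_map, mask):
        for depth, damaged in zip(row_d, row_m):
            if damaged:
                total += depth * pixel_area_mm2
    return total

depth = [[0.0, 0.2, 0.0],
         [0.1, 0.5, 0.1],
         [0.0, 0.2, 0.0]]      # mm below the nominal bore surface
mask = [[0, 1, 0],
        [1, 1, 1],
        [0, 1, 0]]             # damaged pixels from the segmentation stage

print(round(damage_volume(depth, mask, pixel_area_mm2=0.01), 6))  # 0.011
```

A finer approximation would interpolate the depth surface between pixels, but the masked sum is the usual first-order estimate.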
Procedia PDF Downloads 137
3695 Examining Predictive Coding in the Hierarchy of Visual Perception in the Autism Spectrum Using Fast Periodic Visual Stimulation
Authors: Min L. Stewart, Patrick Johnston
Abstract:
Predictive coding has been proposed as a general explanatory framework for understanding the neural mechanisms of perception. As such, an underweighting of perceptual priors has been hypothesised to underpin a range of differences in inferential and sensory processing in autism spectrum disorders. However, empirical evidence to support this has not been well established. The present study uses an electroencephalography paradigm involving changes of facial identity and person category (actors etc.) to explore how levels of autistic traits (AT) affect predictive coding at multiple stages in the visual processing hierarchy. The study uses a rapid serial presentation of faces, with hierarchically structured sequences involving both periodic and aperiodic repetitions of different stimulus attributes (i.e., person identity and person category) in order to induce contextual expectations relating to these attributes. It investigates two main predictions: (1) significantly larger and later neural responses to changes of expected visual sequences in high- relative to low-AT, and (2) significantly reduced neural responses to violations of contextually induced expectation in high- relative to low-AT. Preliminary frequency analysis data comparing high- and low-AT show greater and later event-related potentials (ERPs) in occipitotemporal and prefrontal areas in high-AT than in low-AT for periodic changes of facial identity and person category, but smaller ERPs over the same areas in response to aperiodic changes of identity and category. The research advances our understanding of how abnormalities in predictive coding might underpin aberrant perceptual experience in the autism spectrum. This is the first stage of a research project that will inform clinical practitioners in developing better diagnostic tests and interventions for people with autism.
Keywords: hierarchical visual processing, face processing, perceptual hierarchy, prediction error, predictive coding
Procedia PDF Downloads 111
3694 Adaptive CFAR Analysis for Non-Gaussian Distribution
Authors: Bouchemha Amel, Chachoui Takieddine, H. Maalem
Abstract:
Automatic detection of targets in a modern RADAR system is based primarily on the concept of the adaptive CFAR detector. To achieve effective detection, we must minimize the influence of disturbances due to clutter. The detection algorithm adapts the CFAR detection threshold, which is proportional to the average power of the clutter, maintaining a constant probability of false alarm. In this article, we analyze the performance of two variants of adaptive algorithms, CA-CFAR and OS-CFAR, and we compare the thresholds of these detectors in the marine environment (non-Gaussian) with a Weibull distribution.
Keywords: CFAR, threshold, clutter, distribution, Weibull, detection
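The cell-averaging variant admits a compact sketch: each cell under test is compared against a scaled mean of its surrounding training cells, with guard cells excluded so the target does not contaminate its own clutter estimate. Window sizes, the scaling factor alpha and the data below are illustrative, not the paper's settings:

```python
def ca_cfar(power, num_train=4, num_guard=1, alpha=3.0):
    """Return indices of cells declared targets by cell-averaging CFAR."""
    detections = []
    half = num_train // 2 + num_guard
    for cut in range(half, len(power) - half):
        # Training cells on both sides of the cell under test (CUT),
        # skipping the guard cells adjacent to it.
        train = (power[cut - half: cut - num_guard] +
                 power[cut + num_guard + 1: cut + half + 1])
        threshold = alpha * sum(train) / len(train)
        if power[cut] > threshold:
            detections.append(cut)
    return detections

clutter = [1.0] * 20
clutter[10] = 12.0                      # injected target echo
print(ca_cfar(clutter))                 # [10]
```

OS-CFAR differs only in replacing the mean of the training cells with an order statistic (e.g., the k-th ranked value), which is what makes it more robust in non-Gaussian, Weibull-distributed sea clutter.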
Procedia PDF Downloads 589
3693 Comparative Study of Skeletonization and Radial Distance Methods for Automated Finger Enumeration
Authors: Mohammad Hossain Mohammadi, Saif Al Ameri, Sana Ziaei, Jinane Mounsef
Abstract:
Automated enumeration of the number of hand fingers is widely used in several motion gaming and distance control applications and is discussed in several published papers as a starting block for hand recognition systems. An automated finger enumeration technique should not only be accurate but must also respond quickly to a moving-picture input. Real-time video in motion games or distance control strains the program's overall speed, since image processing software such as Matlab needs to produce results at high computation speeds. Since automated finger enumeration with minimum error and processing time is desired, a comparative study of two finger enumeration techniques is presented and analyzed in this paper. In the pre-processing stage, various image processing functions were applied to a real-time video input to obtain the final cleaned, auto-cropped image of the hand to be used by the two techniques. The first technique uses the known morphological tool of skeletonization and counts the skeleton's endpoints as fingers. The second technique uses a radial distance method, which reduces the hand to a one-dimensional representation, to enumerate the number of fingers. For both methods, the different steps of the algorithms are explained. A comparative study then analyzes the accuracy and speed of both techniques. Through experimental testing in different background conditions, it was observed that the radial distance method was more accurate and more responsive to a real-time video input than the skeletonization method. All test results were generated in Matlab and were based on displaying a human hand in three different orientations on top of a plain color background. Finally, the limitations surrounding the enumeration techniques are presented.
Keywords: comparative study, hand recognition, fingertip detection, skeletonization, radial distance, Matlab
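The radial-distance idea above reduces to counting peaks in a 1-D signal of contour-to-centroid distances. A toy sketch of that counting step (the synthetic signal stands in for a real hand contour, and the threshold is an assumption):

```python
def count_fingers(radial, threshold):
    """Count rising crossings of the threshold in the radial signal."""
    fingers = 0
    above = radial[0] > threshold
    for r in radial[1:]:
        if r > threshold and not above:
            fingers += 1          # entered a new finger peak
        above = r > threshold
    return fingers

# Synthetic radial-distance signal: palm radius ~1.0 with five
# rectangular "finger" peaks of height 1.8 spread over 180 samples.
radial = [1.8 if (i // 18) % 2 == 1 else 1.0 for i in range(180)]

print(count_fingers(radial, threshold=1.4))  # 5
```

On real data the signal would be smoothed first, and the threshold chosen relative to the estimated palm radius rather than fixed.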
Procedia PDF Downloads 382
3692 Resource Framework Descriptors for Interestingness in Data
Authors: C. B. Abhilash, Kavi Mahesh
Abstract:
Human beings are the most advanced species on Earth, largely because of the ability to communicate and share information via human language. In today's world, a huge amount of data is available on the web in text format. This has also resulted in the generation of big data in structured and unstructured formats. In general, the data is in textual form, which is highly unstructured. To get insights and actionable content from this data, we need to incorporate the concepts of text mining and natural language processing. In our study, we mainly focus on interesting data, through which interesting facts are generated for the knowledge base. The approach is to derive the analytics from the text via the application of natural language processing. Using the semantic web's Resource Description Framework (RDF), we generate triples from the given data and derive the interesting patterns. The methodology also illustrates data integration using RDF for reliable, interesting patterns.
Keywords: RDF, interestingness, knowledge base, semantic data
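The triple-generation step can be sketched as plain (subject, predicate, object) tuples printed in Turtle-like notation. The entities and predicates below are illustrative; the paper's own extraction pipeline is NLP-driven:

```python
def make_triples(subject, facts):
    """Turn a dict of predicate -> object into RDF-style triples."""
    return [(subject, pred, obj) for pred, obj in facts.items()]

triples = make_triples("ex:Everest", {
    "rdf:type": "ex:Mountain",
    "ex:heightMetres": "8849",
    "ex:locatedIn": "ex:Himalayas",
})
for s, p, o in triples:
    print(f"{s} {p} {o} .")   # Turtle-like serialization
```

Once facts are in triple form, "interesting patterns" can be derived by querying the triple store (e.g., with SPARQL) for superlatives, rare predicate combinations, or unexpected joins across integrated sources.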
Procedia PDF Downloads 162
3691 Influence of Thermal Processing Methods on Antinutrient of Artocarpus heterophyllus Seeds
Authors: Marina Zulkifli, Mohd Faizal Mashhod, Noriham Abdullah
Abstract:
The aim of this study was to determine the antinutrient compounds of jackfruit (Artocarpus heterophyllus) seeds as affected by thermal processes. Two types of heat treatments were applied, namely boiling and microwave cooking. Results of this study showed that boiling caused a significant decrease in phytate content (30.01%), oxalate content (33.22%), saponin content (35.69%) and tannin content (44.58%) as compared to microwave cooking and the raw seed. The percentage loss of antinutrient compounds in the microwaved seed was: phytate 24.58%, oxalate 27.28%, saponin 16.50% and tannin 32.21%. Hence, these findings suggested that boiling is an effective treatment to reduce the level of toxic compounds in foods.
Keywords: jackfruit, heat treatments, antinutrient compounds, thermal processing
Procedia PDF Downloads 433
3690 A Fuzzy Mathematical Model for Order Acceptance and Scheduling Problem
Authors: E. Koyuncu
Abstract:
The problem of Order Acceptance and Scheduling (OAS) is defined as a joint decision of which orders to accept for processing and how to schedule them. Any linear programming model representing a real-world situation involves parameters defined by the decision maker in an uncertain way or by means of linguistic statements. Fuzzy data can be used to incorporate this vagueness of the real-life situation. In this study, a fuzzy mathematical model is proposed for a single-machine OAS problem, where the orders are defined by their fuzzy due dates, fuzzy processing times, and fuzzy sequence-dependent setup times. The signed distance method, one of the fuzzy ranking methods, is used to handle the fuzzy constraints in the model.
Keywords: fuzzy mathematical programming, fuzzy ranking, order acceptance, single machine scheduling
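For triangular fuzzy numbers, one common form of the signed-distance ranking (due to Yao and Wu) reduces a fuzzy quantity (a, b, c) to the crisp value (a + 2b + c)/4, which can then be used inside ordinary constraints. A sketch with invented fuzzy processing times, not the paper's data:

```python
def signed_distance(tri):
    """Crisp value of a triangular fuzzy number (a, b, c): (a+2b+c)/4."""
    a, b, c = tri
    return (a + 2 * b + c) / 4

# Fuzzy processing times (minutes) for three candidate orders.
orders = {"O1": (4, 5, 7), "O2": (3, 6, 8), "O3": (5, 6, 7)}

ranked = sorted(orders, key=lambda o: signed_distance(orders[o]))
print({o: signed_distance(orders[o]) for o in orders})
print(ranked)   # shortest defuzzified processing time first
```

In the full model the same defuzzification would be applied to due dates and sequence-dependent setup times before (or inside) the acceptance-and-sequencing optimization.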
Procedia PDF Downloads 338
3689 Scar Removal Strategy for Fingerprints Using Diffusion
Authors: Mohammad A. U. Khan, Tariq M. Khan, Yinan Kong
Abstract:
Fingerprint image enhancement is one of the most important steps in an automatic fingerprint identification system (AFIS), and it directly affects the overall efficiency of the AFIS. Conventional fingerprint enhancement methods such as Gabor and anisotropic filters do fill the gaps in ridge lines, but they fail to tackle scar lines. To deal with this problem, we propose a method for enhancing the ridges and valleys with scars so that true minutiae points can be extracted with accuracy. Our results have shown an improved performance in terms of enhancement.
Keywords: fingerprint image enhancement, removing noise, coherence, enhanced diffusion
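The "enhanced diffusion" keyword points to edge-preserving smoothing of the Perona-Malik kind: conductance falls off across strong gradients (ridge boundaries) while weak fluctuations such as shallow scar residue are diffused away. A 1-D toy sketch under invented parameters, not the authors' filter:

```python
import math

def diffuse(signal, steps=20, kappa=2.0, dt=0.2):
    """1-D Perona-Malik-style diffusion with exponential conductance."""
    s = list(signal)
    for _ in range(steps):
        new = list(s)
        for i in range(1, len(s) - 1):
            grad_r = s[i + 1] - s[i]
            grad_l = s[i - 1] - s[i]
            # Conductance is ~1 for weak gradients, ~0 across strong edges.
            c_r = math.exp(-(grad_r / kappa) ** 2)
            c_l = math.exp(-(grad_l / kappa) ** 2)
            new[i] = s[i] + dt * (c_r * grad_r + c_l * grad_l)
        s = new
    return s

ridge = [0, 0, 0, 4, 4, 3, 4, 4, 0, 0, 0]   # ridge with a shallow scar dip
out = diffuse(ridge)
# The shallow dip (index 5) is filled while the ridge edges stay sharp.
print(round(out[5], 2), round(out[2], 2))
```

A coherence-enhancing variant, as the keywords suggest, would additionally steer the diffusion along the local ridge orientation rather than treating both directions equally.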
Procedia PDF Downloads 516
3688 Investigating Visual Statistical Learning during Aging Using the Eye-Tracking Method
Authors: Zahra Kazemi Saleh, Bénédicte Poulin-Charronnat, Annie Vinter
Abstract:
This study examines the effects of aging on visual statistical learning, using eye-tracking techniques to investigate this cognitive phenomenon. Visual statistical learning is a fundamental brain function that enables the automatic and implicit recognition, processing, and internalization of environmental patterns over time. Some previous research has suggested the robustness of this learning mechanism throughout the aging process, underscoring its importance in the context of education and rehabilitation for the elderly. The study included three distinct groups of participants: 21 young adults (Mage: 19.73), 20 young-old adults (Mage: 67.22), and 17 old-old adults (Mage: 79.34). Participants were exposed to a series of 12 arbitrary black shapes organized into 6 pairs, each with different spatial configurations and orientations (horizontal, vertical, and oblique). These pairs were not explicitly revealed to the participants, who were instructed to passively observe 144 grids presented sequentially on the screen for a total duration of 7 min. In the subsequent test phase, participants performed a two-alternative forced-choice task in which they had to identify the most familiar pair from 48 trials, each consisting of a base pair and a non-base pair. Behavioral analysis using t-tests revealed notable findings. The mean score for the first group was significantly above chance, indicating the presence of visual statistical learning. Similarly, the second group also performed significantly above chance, confirming the persistence of visual statistical learning in young-old adults. Conversely, the third group, consisting of old-old adults, showed a mean score that was not significantly above chance. This lack of statistical learning in the old-old adult group suggests a decline in this cognitive ability with age. Preliminary eye-tracking results showed a decrease in the number and duration of fixations during the exposure phase for all groups.
The main difference was that older participants focused more often on empty cells than younger participants, likely due to a decline in the ability to ignore irrelevant information, resulting in a decrease in statistical learning performance.
Keywords: aging, eye tracking, implicit learning, visual statistical learning
Procedia PDF Downloads 77
3687 Effects of Temperature and the Use of Bacteriocins on Cross-Contamination from Animal Source Food Processing: A Mathematical Model
Authors: Benjamin Castillo, Luis Pastenes, Fernando Cerdova
Abstract:
The contamination of food by microbial agents is a common problem in the industry, especially in the elaboration of animal source products. Incorrect manipulation of the machinery or of the raw materials can cause a decrease in production or an epidemiological outbreak due to intoxication. In order to improve food product quality, different methods have been used to reduce or, at least, slow down the growth of pathogens, especially spoilage, infectious or toxigenic bacteria. These methods are usually carried out under low temperatures and short processing times (abiotic agents), along with the application of antibacterial substances such as bacteriocins (biotic agents), in a controlled and efficient way that fulfills the purpose of bacterial control without damaging the final product. Therefore, the objective of the present study is to design a secondary mathematical model that allows the prediction of the impact of both the biotic and abiotic factors associated with animal source food processing. In order to accomplish this objective, the authors propose a three-dimensional differential equation model, whose components are: bacterial growth; release, production and artificial incorporation of bacteriocins; and changes in the pH level of the medium. These three dimensions are constantly influenced by the temperature of the medium. Secondly, this model is adapted to an idealized situation of cross-contamination in animal source food processing, the study agents being both the animal product and the contact surface. Thirdly, stochastic simulations and a parametric sensitivity analysis are compared with reference data. The main result obtained from the analysis and simulations of the mathematical model was that, although bacterial growth can be stopped at lower temperatures, even lower ones are needed to eradicate it.
However, this can be not only expensive but also counterproductive in terms of the quality of the raw materials; on the other hand, higher temperatures accelerate bacterial growth. In other respects, the use of bacteriocins is an effective alternative in the short and medium terms. Moreover, a low pH level is an indicator of bacterial growth, since many spoilage bacteria are lactic acid bacteria. Lastly, processing times are a secondary agent of concern when the rest of the aforementioned agents are under control. Our main conclusion is that adapting a mathematical model to the context of an industrial process can generate new tools that predict bacterial contamination, the impact of bacterial inhibition, and processing method times. In addition, the proposed mathematical model is a logistic tool of broad application that can be replicated for non-meat food products, other pathogens, or even contamination by cross-contact of allergenic foods.
Keywords: bacteriocins, cross-contamination, mathematical model, temperature
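A three-dimensional model of the kind described above can be sketched numerically. The following is a minimal illustrative sketch, not the authors' actual equations: logistic, temperature-dependent bacterial growth; bacteriocin concentration fed by microbial production and artificial dosing; and pH declining with acidification. All parameter names and values are assumptions chosen only to show the qualitative behaviour reported (cold slows growth, bacteriocins suppress it).

```python
# Illustrative three-dimensional model: bacteria B, bacteriocin C, pH.
# Equations and parameters are assumptions, not the study's fitted model.

def simulate(T=10.0, dose=0.0, hours=48, dt=0.01):
    """Forward-Euler integration; T is temperature (deg C), dose is the
    artificial bacteriocin input rate (units/h)."""
    B, C, pH = 1e3, 0.0, 6.5            # initial bacteria, bacteriocin, pH
    K = 1e8                             # carrying capacity
    mu = 0.05 * max(T - 4.0, 0.0)       # growth rate rises with temperature
    kill, prod, acid = 1e-3, 1e-9, 1e-10
    for _ in range(int(hours / dt)):
        dB = mu * B * (1 - B / K) - kill * C * B   # growth minus inhibition
        dC = prod * B + dose                       # microbial + artificial input
        dpH = -acid * B                            # acidification by bacteria
        B, C, pH = B + dB * dt, C + dC * dt, pH + dpH * dt
    return B, C, pH

# Lower temperature slows growth; bacteriocin dosing suppresses it further.
cold = simulate(T=4.0)[0]
warm = simulate(T=20.0)[0]
dosed = simulate(T=20.0, dose=50.0)[0]
```

Under these assumed parameters, the warm run approaches the carrying capacity, the cold run barely grows, and continuous dosing drives the warm population back down, mirroring the qualitative conclusions of the abstract.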
Procedia PDF Downloads 144
3686 Sentinel-2 Based Burn Area Severity Assessment Tool in Google Earth Engine
Authors: D. Madhushanka, Y. Liu, H. C. Fernando
Abstract:
Fires are one of the foremost factors of land surface disturbance in diverse ecosystems, causing soil erosion, land-cover changes and atmospheric effects that affect people's lives and properties. Generally, fire severity is calculated with the Normalized Burn Ratio (NBR) index. This is performed manually by comparing images obtained before and after the fire: the dNBR is calculated as the bitemporal difference of the preprocessed satellite images. The burnt area is then classified as either unburnt (dNBR < 0.1) or burnt (dNBR ≥ 0.1). Furthermore, Wildfire Severity Assessment (WSA) distinguishes burnt areas from unburnt areas using classification levels proposed by the USGS and comprises seven classes. This procedure generates a burn severity report for an area chosen manually by the user. This study was carried out with the objective of producing an automated tool for the above-mentioned process, namely the World Wildfire Severity Assessment Tool (WWSAT). It is implemented in Google Earth Engine (GEE), a free cloud-computing platform for satellite data processing with several data catalogs at different resolutions (notably Landsat, Sentinel-2, and MODIS) and planetary-scale analysis capabilities. Sentinel-2 MSI was chosen to support regular burnt area severity mapping with a medium spatial resolution sensor (15 m). The tool uses machine learning classification techniques to identify burnt areas using NBR and to classify their severity over the user-selected extent and period automatically. Cloud coverage is one of the biggest concerns when fire severity mapping is performed. In WWSAT, based on GEE, we present a fully automatic workflow to aggregate cloud-free Sentinel-2 images for both pre-fire and post-fire image compositing. The parallel processing capabilities and preloaded geospatial datasets of GEE facilitated the production of this tool. The tool includes a Graphical User Interface (GUI) to make it user-friendly.
The advantage of this tool is the ability to obtain burn area severity over large extents and extended temporal periods. Two case studies were carried out to demonstrate the performance of the tool. The Blue Mountains National Park forest, affected by the Australian fire season between 2019 and 2020, is used to describe the workflow of the WWSAT. More than 7809 km2 of burnt area was detected at this site using Sentinel-2 data, with an error below 6.5% compared with the area detected in the field. Furthermore, 86.77% of the detected area was recognized as fully burnt out, of which 17.29% was high severity, 19.63% moderate-high severity, 22.35% moderate-low severity, and 27.51% low severity. The Arapaho and Roosevelt National Forests, Colorado, USA, affected by the Cameron Peak fire in 2020, were chosen for the second case study. It was found that around 983 km2 had burnt out, of which 2.73% was high severity, 1.57% moderate-high severity, 1.18% moderate-low severity, and 5.45% low severity. These burnt spots can also be verified through the visual inspection made possible by the cloud-free images generated by WWSAT. The tool is cost-effective in calculating the burnt area, since the satellite images are free and the cost of field surveys is avoided.
Keywords: burnt area, burnt severity, fires, google earth engine (GEE), sentinel-2
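The NBR/dNBR arithmetic that the tool automates can be sketched as follows. This is a minimal numpy illustration, not the GEE implementation: NBR is the normalized difference of the NIR and SWIR bands (B8 and B12 on Sentinel-2), dNBR is the pre-fire minus post-fire NBR, and the severity classes below use the commonly cited USGS thresholds, which are an assumption about the exact levels WWSAT applies.

```python
# Minimal sketch of the NBR / dNBR / severity classification workflow.
import numpy as np

def nbr(nir, swir):
    """Normalized Burn Ratio: (NIR - SWIR) / (NIR + SWIR)."""
    nir, swir = np.asarray(nir, float), np.asarray(swir, float)
    return (nir - swir) / (nir + swir)

def severity(dnbr):
    """Classify a dNBR value into the seven USGS burn severity classes
    (threshold values assumed, see lead-in)."""
    bins = [-0.25, -0.1, 0.1, 0.27, 0.44, 0.66]   # class boundaries
    labels = ["enhanced regrowth, high", "enhanced regrowth, low",
              "unburnt", "low severity", "moderate-low severity",
              "moderate-high severity", "high severity"]
    return labels[int(np.digitize(dnbr, bins))]

# Pre-fire healthy vegetation: high NIR, low SWIR; post-fire: the reverse.
pre = nbr(nir=0.45, swir=0.15)      # 0.5
post = nbr(nir=0.15, swir=0.35)     # -0.4
dnbr = pre - post                   # 0.9 -> high severity
```

The dNBR ≥ 0.1 burnt/unburnt cut mentioned in the abstract corresponds to the "low severity" boundary above.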
Procedia PDF Downloads 235
3685 Impact of Varying Malting and Fermentation Durations on Specific Chemical, Functional Properties, and Microstructural Behaviour of Pearl Millet and Sorghum Flour Using Response Surface Methodology
Authors: G. Olamiti, T. K. Takalani, D. Beswa, A. I. O. Jideani
Abstract:
The study investigated the effects of malting and fermentation times on selected chemical and functional properties and on the microstructural behaviour of flours from the Agrigreen and Babala pearl millet cultivars and sorghum, using response surface methodology (RSM). A Central Composite Rotatable Design (CCRD) was performed on two independent variables, malting and fermentation times (h), at levels of 24, 48, and 72 h. The dependent parameters, pH, titratable acidity (TTA), water absorption capacity (WAC), oil absorption capacity (OAC), bulk density (BD), dispersibility and the microstructural behaviour of the flours, differed significantly (p < 0.05) with malting and fermentation time. Babala flour exhibited the highest pH value, 4.78, at 48 h malting and 81.94 h fermentation. Agrigreen flour showed the highest TTA value, 0.159%, at 81.94 h malting and 48 h fermentation. WAC was also highest in malted and fermented Babala flour, at 2.37 ml g-1 for 81.94 h malting and 48 h fermentation. Sorghum flour exhibited the lowest OAC, 1.67 ml g-1, at 14 h malting and 48 h fermentation. Agrigreen flour recorded the lowest bulk density, 0.53 g ml-1, for 72 h malting and 24 h fermentation. Sorghum flour exhibited the highest dispersibility, 56.34%, after 24 h malting and 72 h fermentation. The response surface plots showed that increasing malting and fermentation times influenced the dependent parameters. The microstructures of the malted and fermented pearl millet and sorghum flours showed isolated, oval, spherical, or polygonal granules with smooth surfaces. The optimal malting and fermentation times were 32.24 h and 63.32 h for Agrigreen, 35.18 h and 34.58 h for Babala, and 36.75 h and 47.88 h for sorghum, with a high desirability of 1.00. Validation of the optimum malting and fermentation times (h) improved the experimental values of the dependent parameters.
Food processing companies can use the study's findings to improve food processing and quality.
Keywords: pearl millet, malting, fermentation, microstructural behaviour
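A two-factor CCRD such as the one described is designed to fit a full second-order response surface, y = b0 + b1·x1 + b2·x2 + b11·x1² + b22·x2² + b12·x1·x2, whose stationary point gives the candidate optimum that desirability analysis then refines. The sketch below fits such a surface by least squares on a CCRD-style design in coded units; the synthetic response values are illustrative, not the study's data.

```python
# Minimal sketch of the second-order RSM model a two-factor CCRD supports.
import numpy as np

def fit_quadratic(x1, x2, y):
    """Least-squares fit of a full second-order polynomial in two factors.
    Returns (b0, b1, b2, b11, b22, b12)."""
    X = np.column_stack([np.ones_like(x1), x1, x2, x1**2, x2**2, x1 * x2])
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    return coef

# CCRD-style design in coded units: factorial, axial at +/-sqrt(2), centre.
a = np.sqrt(2.0)
x1 = np.array([-1, -1, 1, 1, -a, a, 0, 0, 0], dtype=float)  # malting (coded)
x2 = np.array([-1, 1, -1, 1, 0, 0, -a, a, 0], dtype=float)  # fermentation
y = 5.0 + 0.3 * x1 - 0.2 * x2 - 0.5 * x1**2 - 0.4 * x2**2   # known test surface
b = fit_quadratic(x1, x2, y)

# Stationary point of the fitted surface (candidate optimum); with the
# interaction term b12 ~ 0 here, each factor can be solved independently.
x1_opt = -b[1] / (2 * b[3])
x2_opt = -b[2] / (2 * b[4])
```

Because the test response is itself quadratic, the fit recovers the coefficients exactly, which is a useful sanity check before fitting real experimental data.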
Procedia PDF Downloads 71
3684 Pale, Soft, Exudative (PSE) Turkey Meat in a Brazilian Commercial Processing Plant
Authors: Danielle C. B. Honorato, Rafael H. Carvalho, Adriana L. Soares, Ana Paula F. R. L. Bracarense, Paulo D. Guarnieri, Massami Shimokomaki, Elza I. Ida
Abstract:
Over the past decade, the Brazilian production of turkey meat has increased by more than 50%, indicating that turkey meat holds great potential for the Brazilian economy and contributes to the growth of agribusiness in the international market. However, significant color changes may occur during processing, leading to a pale, soft and exudative (PSE) appearance on the surface of breast meat due to low water holding capacity (WHC). Changes in the functional properties of PSE meat occur due to denaturation of myofibrillar proteins caused by rapid postmortem glycolysis, which results in a rapid pH decline while the carcass temperature is still warm. The aim of this study was to analyze the physical, chemical and histological characteristics of PSE turkey meat obtained from a Brazilian commercial processing plant. Turkey breast samples (n = 64) were collected at the processing line and classified as PSE when L* ≥ 53. The pH was also analyzed after the L* measurement. In sequence, PSE meat samples were evaluated for WHC, cooking loss (CL), shear force (SF), myofibril fragmentation index (MFI), protein denaturation (PD) and histological characteristics. The abnormal color samples presented lower pH values, 16% lower fiber diameter, 11% lower SF and 2% lower WHC than those classified as normal. The CL, PD and MFI were, respectively, 9%, 18% and 4% higher in PSE samples. The Pearson correlation between the L* values and CL, PD and MFI was positive, while the SF and pH values presented negative correlations. Under light microscopy, the PSE muscle cell diameter was approximately 16% smaller than in normal samples, and an extracellular enlargement of the endomysium and perimysium sheaths was observed as a consequence of the higher water loss, consistent with the lower WHC values.
Thus, the results showed that PSE turkey breast meat presented significant changes in its physical, chemical and histological characteristics that may impair its functional properties.
Keywords: functional properties, histological evaluation, meat quality, PSE
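The classification rule and the correlation analysis described can be sketched briefly: samples with lightness L* ≥ 53 are labelled PSE, and a Pearson coefficient quantifies the association between L* and each quality trait. The sample values below are synthetic and illustrative, not the study's measurements; they only reproduce the reported sign of the L*-versus-pH correlation.

```python
# Minimal sketch of the L* threshold classification and Pearson correlation.
import numpy as np

def classify_pse(lightness, threshold=53.0):
    """True where a breast sample is classified PSE (L* >= threshold)."""
    return np.asarray(lightness, float) >= threshold

def pearson(x, y):
    """Pearson correlation coefficient between two variables."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    return float(np.corrcoef(x, y)[0, 1])

l_star = np.array([49.2, 51.0, 52.8, 53.1, 54.6, 56.3])   # lightness (synthetic)
ph = np.array([5.95, 5.90, 5.86, 5.78, 5.72, 5.65])       # pH (synthetic)
pse = classify_pse(l_star)        # last three samples exceed the threshold
r = pearson(l_star, ph)           # negative, as reported for L* vs pH
```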
Procedia PDF Downloads 460
3683 The Incidental Linguistic Information Processing and Its Relation to General Intellectual Abilities
Authors: Evgeniya V. Gavrilova, Sofya S. Belova
Abstract:
The present study was aimed at clarifying the relationship between general intellectual abilities and efficiency in free recall and rhymed word generation tasks after incidental exposure to linguistic stimuli. Theoretical frameworks stress that general intellectual abilities are based on intentional mental strategies. In this context, it seems crucial to examine the efficiency of processing incidentally presented information in a cognitive task and its relation to general intellectual abilities. The sample consisted of 32 Russian students. Participants were exposed to pairs of words. Each pair consisted of two common nouns or two city names. Participants had to decide whether a city name was presented in each pair; thus, the words' semantics was processed intentionally. The city names were considered focal stimuli, whereas the common nouns were considered peripheral stimuli. In addition, each pair of words could be rhymed or not rhymed, but this phonemic characteristic of the stimuli was processed incidentally. Participants were then asked to produce as many rhymes as they could to new words; the stimuli presented earlier could be used as well. After that, participants had to retrieve all the words presented earlier. Finally, verbal and non-verbal abilities were measured with a number of psychometric tests. In the free recall task, the intentionally processed focal stimuli had an advantage in recall over the peripheral stimuli. In addition, all rhymed stimuli were recalled more effectively than non-rhymed ones. The inverse effect was found in the word generation task, where participants tended to use mainly peripheral rather than focal stimuli. Furthermore, peripheral rhymed stimuli were the most frequently used category of stimuli in this task.
Thus, incidentally processed information had a supplemental influence on the efficiency of stimulus processing in both the free recall and word generation tasks. Different patterns of correlations between intellectual abilities and the efficiency of processing different stimuli were revealed in the two tasks. Non-verbal reasoning ability correlated positively with free recall of peripheral rhymed stimuli but was not related to performance on the rhymed word generation task. Verbal reasoning ability correlated positively with free recall of focal stimuli. As for the rhymed word generation task, verbal intelligence correlated negatively with generation of focal stimuli and positively with generation of all peripheral stimuli. The present findings lead to two key conclusions. First, incidentally processed stimuli had an advantage in the free recall and word generation tasks; thus, incidental information processing appeared to be crucial for subsequent cognitive performance. Secondly, it was demonstrated that incidentally processed stimuli were recalled more frequently by participants with high non-verbal reasoning ability and were used more effectively by participants with high verbal reasoning ability in subsequent cognitive tasks. This implies that general intellectual abilities may benefit from operating on different levels of information processing during cognitive problem solving. This research was supported by the “Grant of the President of RF for young PhD scientists” (contract № 14.Z56.17.2980-MK) and Grant № 15-36-01348a2 of the Russian Foundation for Humanities.
Keywords: focal and peripheral stimuli, general intellectual abilities, incidental information processing
Procedia PDF Downloads 231
3682 Autonomous Ground Vehicle Navigation Based on a Single Camera and Image Processing Methods
Authors: Auday Al-Mayyahi, Phil Birch, William Wang
Abstract:
A vision-based navigation system for an autonomous ground vehicle (AGV) equipped with a single camera in an indoor environment is presented. The proposed navigation algorithm detects obstacles represented by coloured mini-cones placed in different positions inside a corridor. For the recognition of the relative position and orientation of the AGV with respect to the coloured mini-cones, the features of the corridor structure are extracted using the single-camera vision system. The relative position, offset distance and steering angle of the AGV from the coloured mini-cones are derived from the simple corridor geometry to obtain a mapped environment in real-world coordinates. The corridor is first captured as an image using the single camera. Image processing functions are then performed to identify the existence of the cones within the environment. A bounding box surrounding each cone allows the locations of the cones to be identified in a pixel coordinate system. Thus, by matching the mapped and pixel coordinates using a projection transformation matrix, the real offset distances between the camera and the obstacles are obtained. Real-time experiments in an indoor environment were carried out with a wheeled AGV in order to demonstrate the validity and effectiveness of the proposed algorithm.
Keywords: autonomous ground vehicle, navigation, obstacle avoidance, vision system, single camera, image processing, ultrasonic sensor
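The projection-transformation step described above can be sketched as a planar homography: a 3×3 matrix H maps a pixel coordinate on the corridor floor plane to a real-world coordinate, from which the offset distance and steering angle to a detected cone follow. The matrix H below is an illustrative placeholder, not the calibration from the paper.

```python
# Minimal sketch of pixel-to-world mapping with a projection (homography) matrix.
import numpy as np

def pixel_to_world(H, u, v):
    """Apply homography H to pixel (u, v); returns world (x, y) after
    dehomogenising by the third coordinate."""
    p = H @ np.array([u, v, 1.0])
    return p[0] / p[2], p[1] / p[2]

# Illustrative floor-plane homography (assumed): ~100 px per metre,
# image-origin offset, and a mild perspective term in the last row.
H = np.array([[0.01, 0.0, -3.2],
              [0.0, 0.01, -2.4],
              [0.0, 0.0005, 1.0]])

x, y = pixel_to_world(H, u=420.0, v=300.0)     # cone centre in pixels
offset = float(np.hypot(x, y))                 # offset distance (m)
angle = float(np.degrees(np.arctan2(x, y)))    # steering angle (deg)
```

In practice H would be estimated once from known corridor reference points (e.g. with a direct linear transform) and then applied to every bounding-box centre.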
Procedia PDF Downloads 302
3681 Biogas Production from Pistachio (Pistacia vera L.) Processing Waste
Authors: İ. Çelik, Goksel Demirer
Abstract:
Turkey is the third largest producer of pistachio (Pistacia vera L.) after Iran and the United States. Harvested pistachio nuts are covered with an organic hull which is removed by a de-hulling process. Most of the pistachio by-products produced during de-hulling are considered agricultural waste and are often mixed with soil; to a lesser extent they are used as feedstuff by local livestock farmers, and a small portion is used as herbal medicine. Due to their high organic and phenolic content as well as high solids concentration, pistachio processing wastes create significant waste management problems unless they are properly managed. However, there is no well-established management method for the waste generated during the processing of pistachios. This study investigated the anaerobic treatability and biogas generation potential of pistachio hull waste, including the effect of pre-treatment on biogas generation potential. For this purpose, Biochemical Methane Potential (BMP) assays were conducted for two Chemical Oxygen Demand (COD) concentrations of 22 and 33 g tCOD l-1 in the absence and presence of chemical and thermal pre-treatment. The results revealed that anaerobic digestion of pistachio de-hulling wastes, and thus biogas production as a renewable energy source, is possible. The observed percent COD removal and methane yield values of the pre-treated pistachio de-hulling waste samples were significantly higher than those of the raw pistachio de-hulling waste. The highest methane yield observed was 213.4 ml CH4/g COD.
Keywords: pistachio de-hulling waste, biogas, renewable energy, pre-treatment
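The bookkeeping behind the two reported performance figures, percent COD removal and methane yield (ml CH4 per g COD), can be sketched as follows. The reactor volume and cumulative methane volume below are illustrative assumptions, chosen only so the yield works out to the best value quoted in the abstract.

```python
# Minimal sketch of BMP assay bookkeeping: COD removal and methane yield.

def cod_removal(cod_in, cod_out):
    """Percent COD removed during digestion (g/l in, g/l out)."""
    return 100.0 * (cod_in - cod_out) / cod_in

def methane_yield(ch4_ml, cod_fed_g):
    """Methane yield in ml CH4 per g COD fed."""
    return ch4_ml / cod_fed_g

# Assumed example: a 0.5 l reactor fed at 22 g COD/l holds 11 g COD;
# producing 2347 ml CH4 gives 2347 / 11 ~ 213.4 ml CH4/g COD, matching
# the highest yield reported in the abstract.
cod_g = 22.0 * 0.5
yield_ml_per_g = methane_yield(2347.0, cod_g)
removal = cod_removal(cod_in=22.0, cod_out=6.6)   # 70.0 %
```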
Procedia PDF Downloads 215
3680 The Use of Artificial Intelligence in Diagnosis of Mastitis in Cows
Authors: Djeddi Khaled, Houssou Hind, Miloudi Abdellatif, Rabah Siham
Abstract:
In the field of veterinary medicine, there is a growing application of artificial intelligence (AI) for diagnosing bovine mastitis, a prevalent inflammatory disease in dairy cattle. AI technologies, such as automated milking systems, have streamlined the assessment of key metrics crucial for managing cow health during milking and identifying prevalent diseases, including mastitis. These automated milking systems empower farmers to implement automatic mastitis detection by analyzing indicators like milk yield, electrical conductivity, fat, protein, lactose, blood content in the milk, and milk flow rate. Furthermore, reports highlight the integration of somatic cell count (SCC), thermal infrared thermography, and diverse systems utilizing statistical models and machine learning techniques, including artificial neural networks, to enhance the overall efficiency and accuracy of mastitis detection. According to a review of 15 publications, machine learning technology can predict the risk and detect mastitis in cattle with an accuracy ranging from 87.62% to 98.10% and sensitivity and specificity ranging from 84.62% to 99.4% and 81.25% to 98.8%, respectively. Additionally, machine learning algorithms and microarray meta-analysis are utilized to identify mastitis genes in dairy cattle, providing insights into the underlying functional modules of mastitis disease. Moreover, AI applications can assist in developing predictive models that anticipate the likelihood of mastitis outbreaks based on factors such as environmental conditions, herd management practices, and animal health history. This proactive approach supports farmers in implementing preventive measures and optimizing herd health. By harnessing the power of artificial intelligence, the diagnosis of bovine mastitis can be significantly improved, enabling more effective management strategies and ultimately enhancing the health and productivity of dairy cattle. 
The integration of artificial intelligence presents valuable opportunities for the precise and early detection of mastitis, providing substantial benefits to the dairy industry.
Keywords: artificial intelligence, automatic milking system, cattle, machine learning, mastitis
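One of the indicators named above, electrical conductivity of the milk, illustrates the kind of signal the cited detection systems monitor. The sketch below is a deliberately simple rule-based stand-in, not any of the machine learning models from the review: it flags a milking whose conductivity rises well above the animal's own rolling baseline. The window size, threshold factor, and data are all assumptions.

```python
# Minimal rule-based sketch of a conductivity-based mastitis alert.
import statistics

def mastitis_alert(conductivity, window=7, k=2.0):
    """Return indices of milkings whose conductivity exceeds the rolling
    baseline mean of the previous `window` milkings by more than
    k standard deviations."""
    alerts = []
    for i in range(window, len(conductivity)):
        base = conductivity[i - window:i]
        mu, sd = statistics.mean(base), statistics.stdev(base)
        if conductivity[i] > mu + k * sd:
            alerts.append(i)
    return alerts

# Stable conductivity (mS/cm, synthetic) with a sudden rise at the end.
series = [4.5, 4.6, 4.4, 4.5, 4.7, 4.5, 4.6, 4.5, 4.6, 6.2]
flagged = mastitis_alert(series)   # only the final milking is flagged
```

A learned model would combine this signal with yield, milk composition, and SCC rather than thresholding a single trace.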
Procedia PDF Downloads 65
3679 A Cognitive Training Program in Learning Disability: A Program Evaluation and Follow-Up Study
Authors: Krisztina Bohacs, Klaudia Markus
Abstract:
To the authors' best knowledge, studies on cognitive program evaluation are lacking, and few programs have been shown to achieve high effect sizes with strong retention results. The purpose of our study was to investigate the effectiveness of a comprehensive cognitive training program, namely BrainRx. This cognitive rehabilitation program targets and remediates seven core cognitive skills and related systems of sub-skills through repeated engagement in game-like mental procedures delivered one-on-one by a clinician, supplemented by digital training. A sample of children with learning disability were given pretest and post-test cognitive assessments. The experimental group completed a twenty-week cognitive training program in a BrainRx center. A matched control group received another twenty-week intervention with Feuerstein's Instrumental Enrichment programs; a second matched control group did not receive training. For the pre- and post-tests, we used a general intelligence test to assess IQ and a computer-based test battery for assessing cognition across the lifespan. Multiple regression analyses indicated that the experimental BrainRx treatment group had statistically significantly higher outcomes in attention, working memory, processing speed, logic and reasoning, auditory processing, visual processing and long-term memory compared to the non-treatment control group, with very large effect sizes. With the exception of logic and reasoning, the BrainRx treatment group realized significantly greater gains in six of the seven cognitive measures compared to the Feuerstein control group. Our one-year retention measures showed that all the cognitive training gains were retained above ninety percent, with the greatest retention in visual processing, auditory processing, logic, and reasoning. The BrainRx program may be an effective tool to establish long-term cognitive changes in students with learning disabilities.
Recommendations are made for treatment centers and special education institutions on the cognitive training of students with special needs. The importance of our study is to show that a targeted, systematic, progressively loaded and intensive brain training approach may significantly improve learning disabilities.
Keywords: cognitive rehabilitation training, cognitive skills, learning disability, permanent structural cognitive changes
Procedia PDF Downloads 202
3678 Study and Conservation of Cultural and Natural Heritages with the Use of Laser Scanner and Processing System for 3D Modeling Spatial Data
Authors: Julia Desiree Velastegui Caceres, Luis Alejandro Velastegui Caceres, Oswaldo Padilla, Eduardo Kirby, Francisco Guerrero, Theofilos Toulkeridis
Abstract:
It is fundamental to conserve sites of natural and cultural heritage with any available technique or existing preservation methodology in order to sustain them for the following generations. We propose a further approach to protect the current appearance of such sites, in which high-technology instrumentation enables natural and cultural heritage to be digitally preserved, applied here in Ecuador. In this project, the use of laser technology to generate three-dimensional models with high accuracy in a relatively short period of time is presented. In Ecuador, so far, there are no records of the use and processing of data obtained by this new technological trend. The importance of the project lies in the description of the methodology of the laser scanner system using the Faro Laser Scanner Focus 3D 120, the method for 3D modeling of geospatial data, and the development of virtual environments in the areas of cultural and natural heritage. In order to inform users of this technological trend, the generated three-dimensional models have been prepared for display in a wide range of digital formats. The obtained 3D models demonstrate that this technology is extremely useful in these areas, while also indicating that each data campaign needs a slightly different individual procedure, from data capture and processing through to the final chosen virtual environments.
Keywords: laser scanner system, 3D model, cultural heritage, natural heritage
Procedia PDF Downloads 306
3677 Physical Properties of Nine Nigerian Staple Food Flours Related to Bulk Handling and Processing
Authors: Ogunsina Babatunde, Aregbesola Omotayo, Adebayo Adewale, Odunlami Johnson
Abstract:
The physical properties of nine Nigerian staple food flours related to bulk handling and processing were investigated following standard procedures. The results showed that the moisture content, bulk density, angle of repose, water absorption capacity, swelling index, dispersibility, pH and wettability of the flours ranged from 9.95 to 11.98%, 0.44 to 0.66 g/cm3, 31.43 to 39.65°, 198.3 to 291.7 g of water/100 g of sample, 5.53 to 7.63, 60.3 to 73.8%, 4.43 to 6.70, and 11 to 150 s, respectively. The particle size analysis of the flour samples indicated significant differences (p < 0.05). The least gelation concentration of the flour samples ranged from 6 to 14%. The colour of the flours fell between light and saturated, with the exception of the cassava, millet and maize flours, which appear dark and dull. The properties of food flours depend largely on the inherent properties of the food material and may influence their functional behaviour as food materials.
Keywords: properties, flours, staple food, bulk handling
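Two of the bulk-handling measurements reported above reduce to simple formulas: bulk density is the mass of flour divided by the volume it occupies, and the angle of repose of a conical heap can be estimated as atan(2h/d) from its height h and base diameter d. The sample numbers below are illustrative, not the study's measurements; they merely land inside the reported ranges.

```python
# Minimal sketch of bulk density and angle-of-repose calculations.
import math

def bulk_density(mass_g, volume_cm3):
    """Bulk density in g/cm3."""
    return mass_g / volume_cm3

def angle_of_repose(height_cm, base_diameter_cm):
    """Angle of repose (degrees) of a conical heap: atan(2h / d)."""
    return math.degrees(math.atan(2.0 * height_cm / base_diameter_cm))

rho = bulk_density(mass_g=55.0, volume_cm3=100.0)          # 0.55 g/cm3
theta = angle_of_repose(height_cm=4.0, base_diameter_cm=10.0)
# theta ~ 38.66 deg, within the reported 31.43-39.65 deg range
```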
Procedia PDF Downloads 481
3676 Gradient Boosted Trees on Spark Platform for Supervised Learning in Health Care Big Data
Authors: Gayathri Nagarajan, L. D. Dhinesh Babu
Abstract:
Health care is one of the prominent industries that generate voluminous data, creating the need for machine learning techniques with big data solutions for efficient processing and prediction. Missing data, incomplete data, real-time streaming data, sensitive data, privacy and heterogeneity are a few of the common challenges to be addressed for efficient processing and mining of health care data. In comparison with other applications, accuracy and fast processing are of higher importance for health care applications, as they relate directly to human life. Though there are many machine learning techniques and big data solutions used for efficient processing and prediction in health care data, different techniques and different frameworks prove effective for different applications, largely depending on the characteristics of the datasets. In this paper, we present a framework that uses the ensemble machine learning technique of gradient boosted trees for data classification in health care big data. The framework is built on the Spark platform, which is fast in comparison with other traditional frameworks. Unlike other works that focus on a single technique, our work presents a comparison of six different machine learning techniques along with gradient boosted trees on datasets of different characteristics. Five benchmark health care datasets are considered for experimentation, and the results of the different machine learning techniques are discussed in comparison with gradient boosted trees. The metrics chosen for comparison are the misclassification error rate and the run time of the algorithms. The goals of this paper are to i) compare the performance of gradient boosted trees with other machine learning techniques on the Spark platform, specifically for health care big data, and ii) discuss the results from experiments conducted on datasets of different characteristics, thereby drawing inferences and conclusions.
The experimental results show that for the other machine learning techniques, accuracy depends largely on the characteristics of the datasets, whereas gradient boosted trees yield reasonably stable accuracy without depending strongly on dataset characteristics.
Keywords: big data analytics, ensemble machine learning, gradient boosted trees, Spark platform
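The core of the gradient boosted trees technique can be sketched in a few dozen lines: each weak learner (here a depth-1 tree, or stump) is fit to the residual of the ensemble so far under squared loss, and predictions are accumulated with a learning rate. This is a pure-numpy illustration of the technique and its misclassification-error metric, not the Spark MLlib implementation the paper's framework is built on; the toy dataset is an assumption.

```python
# Minimal sketch of gradient boosting with decision stumps (squared loss).
import numpy as np

def fit_stump(X, r):
    """Best single-feature threshold split minimising squared error on residuals r."""
    best_err, best = np.inf, None
    for j in range(X.shape[1]):
        for t in np.unique(X[:, j])[:-1]:          # candidate thresholds
            left = X[:, j] <= t
            lv, rv = r[left].mean(), r[~left].mean()
            err = ((r - np.where(left, lv, rv)) ** 2).sum()
            if err < best_err:
                best_err, best = err, (j, t, lv, rv)
    return best

def stump_predict(stump, X):
    j, t, lv, rv = stump
    return np.where(X[:, j] <= t, lv, rv)

def gbt_fit(X, y, n_trees=10, lr=0.5):
    """Boosting: each stump is fit to the residual y - F of the ensemble so far."""
    F, trees = np.zeros(len(y), dtype=float), []
    for _ in range(n_trees):
        stump = fit_stump(X, y - F)
        F += lr * stump_predict(stump, X)
        trees.append(stump)
    return trees

def gbt_predict(trees, X, lr=0.5):
    F = np.zeros(len(X))
    for stump in trees:
        F += lr * stump_predict(stump, X)
    return np.where(F >= 0, 1, -1)

# Toy two-feature dataset with labels encoded as +1 / -1.
X = np.array([[1.0, 5.0], [2.0, 4.5], [1.5, 4.8],
              [6.0, 1.0], [5.5, 0.8], [6.5, 1.2]])
y = np.array([1, 1, 1, -1, -1, -1])
trees = gbt_fit(X, y)
err_rate = float(np.mean(gbt_predict(trees, X) != y))   # misclassification error
```

Production implementations add depth, subsampling, and regularization, and (on Spark) distribute the split search across partitions, but the residual-fitting loop above is the same idea.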
Procedia PDF Downloads 240