Search results for: thermomechanical processing
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 3611

3251 High Pressure Processing of Jackfruit Bulbs: Effect on Color, Nutrient Profile and Enzyme Inactivation

Authors: Jyoti Kumari, Pavuluri Srinivasa Rao

Abstract:

Jackfruit (Artocarpus heterophyllus L.) is an underutilized yet highly nutritious fruit with a unique flavour, known for its therapeutic and culinary properties. Fresh jackfruit bulb has a very short shelf life due to its high moisture and sugar content, leading to microbial spoilage and enzymatic browning, which hinder its consumer acceptability and marketability. An attempt has been made to preserve ripe jackfruit bulbs by the application of high pressure (HP) over a range of 200-500 MPa at ambient temperature for dwell times ranging from 5 to 20 min. The physicochemical properties of jackfruit bulbs such as pH, TSS, and titratable acidity were not affected by the pressurization process. The ripening index of the fruit bulb also decreased following HP treatment. While the ascorbic acid and antioxidant activity of jackfruit bulb were well retained by high pressure processing (HPP), the total phenols and carotenoids showed a slight increase. HPP significantly affected the colour and textural properties of jackfruit bulb. High pressure processing was highly effective in reducing the browning index of jackfruit bulbs in comparison to untreated bulbs. The firmness of the bulbs improved upon pressure treatment with longer dwell times. Polyphenol oxidase was identified as the most prominent oxidative enzyme in the jackfruit bulb. The enzymatic activities of polyphenol oxidase and peroxidase were significantly reduced, by up to 40%, following treatment at 400 MPa/15 min. HPP of jackfruit bulbs at ambient temperature is shown to be highly beneficial in improving shelf stability, retaining the nutrient profile, color, and appearance while ensuring maximum inactivation of the spoilage enzymes.

Keywords: antioxidant capacity, ascorbic acid, carotenoids, color, HPP-high pressure processing, jackfruit bulbs, polyphenol oxidase, peroxidase, total phenolic content

Procedia PDF Downloads 147
3250 The Effect of Speech-Shaped Noise and Speaker’s Voice Quality on First-Grade Children’s Speech Perception and Listening Comprehension

Authors: I. Schiller, D. Morsomme, A. Remacle

Abstract:

Children’s ability to process spoken language develops until the late teenage years. At school, where efficient spoken language processing is key to academic achievement, listening conditions are often unfavorable. High background noise and a poor teacher’s voice represent typical sources of interference. It can be assumed that these factors particularly affect primary school children, because their language and literacy skills are still low. While it is generally accepted that background noise and impaired voice impede spoken language processing, there is an increasing need to analyze their impact within specific linguistic areas. Against this background, the aim of the study was to investigate the effect of speech-shaped noise and imitated dysphonic voice on first-grade primary school children’s speech perception and sentence comprehension. Via headphones, 5- to 6-year-old children, recruited within the French-speaking community of Belgium, listened to and performed a minimal-pair discrimination task and a sentence-picture matching task. Stimuli were randomly presented according to four experimental conditions: (1) normal voice / no noise, (2) normal voice / noise, (3) impaired voice / no noise, and (4) impaired voice / noise. The primary outcome measure was task score. How did performance vary with respect to listening condition? Preliminary results will be presented with respect to speech perception and sentence comprehension and carefully interpreted in the light of past findings. This study helps to support our understanding of children’s language processing skills under adverse conditions. Results shall serve as a starting point for probing new measures to optimize children’s learning environment.

Keywords: impaired voice, sentence comprehension, speech perception, speech-shaped noise, spoken language processing

Procedia PDF Downloads 165
3249 DNA Multiplier: A Design Architecture of a Multiplier Circuit Using DNA Molecules

Authors: Hafiz Md. Hasan Babu, Khandaker Mohammad Mohi Uddin, Nitish Biswas, Sarreha Tasmin Rikta, Nuzmul Hossain Nahid

Abstract:

Nanomedicine and bioengineering use biological systems that can perform computing operations. In a biocomputational circuit, different types of biomolecules and DNA (deoxyribonucleic acid) are used as active components. DNA computing offers parallel processing capability and a large storage capacity, which distinguishes it from other computing systems. In most processors, the multiplier is treated as a core hardware block, and multiplication is one of the most time-consuming and lengthy tasks. In this paper, DNA multipliers that are cost-effective compared with conventional designs are constructed using algorithms of molecular DNA operations. The speed and storage capacity of a DNA multiplier are also much higher than those of a traditional silicon-based multiplier.

Keywords: biological systems, DNA multiplier, large storage, parallel processing

Procedia PDF Downloads 170
3248 Benchmarking Bert-Based Low-Resource Language: Case Uzbek NLP Models

Authors: Jamshid Qodirov, Sirojiddin Komolov, Ravilov Mirahmad, Olimjon Mirzayev

Abstract:

Nowadays, natural language processing tools play a crucial role in our daily lives, including various text processing techniques. Very advanced models exist for widely used languages such as English and Russian, but in some languages, such as Uzbek, NLP models have only been developed recently. Thus, there are only a few NLP models for the Uzbek language. Moreover, there is no work showing how the existing Uzbek NLP models behave in different situations and when each should be used. This work tries to close this gap and compares the Uzbek NLP models existing as of the time this article was written. The authors compare the NLP models in two different scenarios: sentiment analysis and sentence similarity, which are implementations of the two most common problems in the industry: classification and similarity. Another outcome of this work is two datasets for classification and sentence similarity in the Uzbek language that we generated ourselves and that can be useful in both industry and academia.
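For readers unfamiliar with the sentence-similarity setup used here, the sketch below shows one common way to score sentence pairs with a BERT-style encoder: mean-pooled token embeddings compared by cosine similarity. It is illustrative only; the checkpoint name is a placeholder, not one of the Uzbek models actually benchmarked in the paper.

```python
# Minimal sketch of BERT-based sentence similarity (mean pooling + cosine).
# The model name below is a hypothetical placeholder, not the paper's model.
import torch
from transformers import AutoTokenizer, AutoModel

MODEL_NAME = "some-org/uzbek-bert-base"  # hypothetical checkpoint

tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
model = AutoModel.from_pretrained(MODEL_NAME)

def embed(sentences):
    """Return one mean-pooled embedding per sentence."""
    enc = tokenizer(sentences, padding=True, truncation=True, return_tensors="pt")
    with torch.no_grad():
        hidden = model(**enc).last_hidden_state          # (batch, seq, dim)
    mask = enc["attention_mask"].unsqueeze(-1).float()   # ignore padding tokens
    return (hidden * mask).sum(1) / mask.sum(1)

emb = embed(["Bugun havo juda yaxshi.", "Ob-havo bugun ajoyib."])
score = torch.nn.functional.cosine_similarity(emb[0], emb[1], dim=-1)
print(float(score))   # higher value = more similar sentences
```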

Keywords: NLP, benchmark, BERT, vectorization

Procedia PDF Downloads 27
3247 A Bottleneck-Aware Power Management Scheme in Heterogeneous Processors for Web Apps

Authors: Inyoung Park, Youngjoo Woo, Euiseong Seo

Abstract:

With the advent of WebGL, Web apps are now able to provide high-quality graphics by utilizing the underlying graphics processing units (GPUs). Although Web apps are becoming common and popular, the current power management schemes, which were devised for conventional native applications, are suboptimal for Web apps because of the additional layer, the Web browser, between the OS and the application. The Web browser running on a CPU issues GL commands, which render the images to be displayed by the currently running Web app, to the GPU, and the GPU processes them. The size and number of issued GL commands determine the processing load of the GPU. While the GPU is processing the GL commands, the CPU simultaneously executes the other compute-intensive threads. The actual user experience is determined by either CPU processing or GPU processing, depending on which of the two is the more demanded resource. For example, when the GPU work queue is saturated by outstanding commands, lowering the performance level of the CPU does not affect the user experience because it is already deteriorated by the delayed execution of GPU commands. Consequently, it would be desirable to lower the CPU or GPU performance level to save energy when the other resource is saturated and becomes a bottleneck in the execution flow. Based on this observation, we propose a power management scheme that is specialized for the Web app runtime environment. This approach raises two technical challenges: identification of the bottleneck resource and determination of the appropriate performance level for the unsaturated resource. The proposed power management scheme uses the CPU utilization level of the Window Manager to tell which resource, if any, is the bottleneck. The Window Manager draws the final screen using the processed results delivered from the GPU. Thus, the Window Manager is on the critical path that determines the quality of user experience and is executed purely by the CPU. The proposed scheme uses a weighted average of the Window Manager utilization to prevent excessive sensitivity and fluctuation. We classified Web apps into three categories using analysis results that measure frame-per-second (FPS) changes under diverse CPU/GPU clock combinations. The results showed that the capability of the CPU decides the user experience when the Window Manager utilization is above 90%, and consequently, the proposed scheme decreases the performance level of the CPU by one step. On the contrary, when its utilization is less than 60%, the bottleneck usually lies in the GPU, and it is desirable to decrease the performance of the GPU. Even for the processing unit that is not on the critical path, an excessive performance drop can occur and may adversely affect the user experience. Therefore, our scheme lowers the frequency gradually until it finds an appropriate level by periodically checking the CPU utilization. The proposed scheme reduced energy consumption by 10.34% on average in comparison to the conventional Linux kernel, and it worsened FPS by only 1.07% on average.
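The decision rule described in the abstract can be summarized in a few lines. The sketch below is a transcription of the stated thresholds (90% and 60% Window Manager utilization) and the one-step frequency changes; the smoothing weight and the action strings are illustrative stubs, since the actual scheme is implemented inside the kernel's frequency governor.

```python
# Sketch of the decision rule as stated in the abstract: a weighted average of
# Window Manager CPU utilization selects which processor's performance level
# is stepped down. The smoothing weight and action names are illustrative.
class BottleneckAwareGovernor:
    def __init__(self, alpha=0.3):
        self.alpha = alpha        # smoothing weight (illustrative value)
        self.wm_util = None       # weighted average of Window Manager utilization

    def update(self, sample):
        # Weighted averaging prevents excessive sensitivity and fluctuation.
        if self.wm_util is None:
            self.wm_util = sample
        else:
            self.wm_util = self.alpha * sample + (1 - self.alpha) * self.wm_util

    def decide(self):
        if self.wm_util is None:
            return "hold"
        if self.wm_util > 0.90:   # CPU capability decides the user experience
            return "step_down_cpu"
        if self.wm_util < 0.60:   # bottleneck usually lies in the GPU
            return "step_down_gpu"
        return "hold"             # between thresholds: keep current levels

gov = BottleneckAwareGovernor()
gov.update(0.95)
print(gov.decide())               # -> "step_down_cpu"
```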

Keywords: interactive applications, power management, QoS, Web apps, WebGL

Procedia PDF Downloads 167
3246 Improvement of Piezoresistive Pressure Sensor Accuracy by Means of Current Loop Circuit Using Optimal Digital Signal Processing

Authors: Peter A. L’vov, Roman S. Konovalov, Alexey A. L’vov

Abstract:

The paper presents an advanced digital modification of the conventional current loop circuit for piezoelectric pressure transducers. Optimal DSP algorithms, which process the current loop responses by the maximum likelihood method, are applied to diminish measurement errors. The loop circuit has some additional advantages, such as the possibility to operate with any type of resistance or reactance sensor, and a considerable increase in accuracy and quality of measurements compared with AC bridges. The results obtained are intended to allow high-accuracy and expensive measuring bridges to be replaced with current loop circuits.
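As a generic illustration of why maximum likelihood processing diminishes measurement errors (not the authors' algorithm), the sketch below shows the textbook case: for repeated samples corrupted by zero-mean Gaussian noise, the ML estimate is the sample mean, and its standard error shrinks with the number of samples.

```python
# Illustration only: ML estimation of a constant loop response under Gaussian
# noise reduces to averaging, with error falling roughly as 1/sqrt(N).
import numpy as np

rng = np.random.default_rng(0)
true_value = 4.20          # e.g. a loop response proportional to pressure (arbitrary units)
noise_sigma = 0.05
samples = true_value + noise_sigma * rng.normal(size=1000)

ml_estimate = samples.mean()                           # ML estimator under Gaussian noise
std_error = samples.std(ddof=1) / np.sqrt(len(samples))
print(f"estimate = {ml_estimate:.4f} +/- {std_error:.4f}")
```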

Keywords: current loop, maximum likelihood method, optimal digital signal processing, precise pressure measurement

Procedia PDF Downloads 502
3245 Influence of Thermal Treatments on Ovomucoid as Allergenic Protein

Authors: Nasser A. Al-Shabib

Abstract:

Food allergens are most commonly encountered in a non-native form when exposed to the immune system. Most food proteins undergo various treatments (e.g. thermal or proteolytic processing) during food manufacturing. Such treatments have the potential to alter the chemical structure of food allergens and convert them to more denatured or unfolded forms. The conformational changes in the proteins may affect the allergenicity of the treated allergens. However, most allergenic proteins possess high resistance against thermal modification or digestive enzymes. In the present study, ovomucoid (a major allergenic protein of egg white) was heated in phosphate-buffered saline (pH 7.4) at different temperatures, in different aqueous solutions, and on different surfaces for various times. The results indicated that different antibody-based methods had different sensitivities in detecting the heated ovomucoid. When using one particular immunoassay, the immunoreactivity of ovomucoid increased rapidly after heating in water, whereas immunoreactivity declined after heating in alkaline buffer (pH 10). Ovomucoid appeared more immunoreactive when dissolved in PBS (pH 7.4) and heated on a stainless steel surface. To the best of our knowledge, this is the first time that antibody-based methods have been applied for the detection of ovomucoid adsorbed onto different surfaces under various conditions. The results obtained suggest that the use of antibodies to detect ovomucoid after food processing may be problematic. False assurance will be given by the use of inappropriate, non-validated immunoassays such as those available commercially as ‘Swab’ tests. A greater understanding of antibody-protein interaction after processing of a protein is required.

Keywords: ovomucoid, thermal treatment, solutions, surfaces

Procedia PDF Downloads 426
3244 Scheduling in Cloud Networks Using Chakoos Algorithm

Authors: Masoumeh Ali Pouri, Hamid Haj Seyyed Javadi

Abstract:

Nowadays, cloud processing is one of the important issues in information technology. Since scheduling of task graphs is an NP-hard problem, approaches based on non-deterministic methods such as evolutionary processing, mostly genetic and cuckoo algorithms, are effective. Therefore, an efficient algorithm has been proposed for scheduling of task graphs to obtain an appropriate schedule with minimum time. In this algorithm, the new approach is based on making the length of the critical path shorter and reducing the cost of communication. Finally, the results obtained from the implementation of the presented method show that this algorithm performs the same as other algorithms when it faces graphs without communication cost, and it performs faster and better than algorithms such as DSC and MCP when it faces graphs involving communication cost.
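The two quantities the algorithm targets, critical-path length and communication cost, can be illustrated on a toy task graph. The sketch below computes the critical path of a small DAG whose node weights are computation times and whose edge weights are communication costs; the graph is illustrative, not one of the benchmark graphs from the paper.

```python
# Critical-path length of a task graph: node = computation time, edge =
# communication cost. Illustrative graph, not from the paper's benchmarks.
from functools import lru_cache

comp = {"A": 2, "B": 3, "C": 4, "D": 2}          # task computation times
succ = {"A": {"B": 1, "C": 5}, "B": {"D": 2},    # successors with communication costs
        "C": {"D": 1}, "D": {}}

@lru_cache(maxsize=None)
def longest_path_from(task):
    """Computation plus the heaviest downstream chain (incl. communication)."""
    tails = [cost + longest_path_from(child) for child, cost in succ[task].items()]
    return comp[task] + (max(tails) if tails else 0)

critical_path = max(longest_path_from(t) for t in comp)
print(critical_path)   # A -> C -> D: 2 + 5 + 4 + 1 + 2 = 14
```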

Keywords: cloud computing, scheduling, tasks graph, chakoos algorithm

Procedia PDF Downloads 36
3243 Twitter Sentiment Analysis during the Lockdown on New-Zealand

Authors: Smah Almotiri

Abstract:

One of the most common fields of natural language processing (NLP) is sentiment analysis. The feeling inferred from text can be successfully mined for various events using sentiment analysis. Twitter is viewed as a reliable data source for sentiment analysis studies since people used social media to receive and exchange different types of data on a broad scale during the COVID-19 epidemic. The processing of such data may aid in making critical decisions on how to keep the situation under control. The aim of this research is to look at how sentiment differed in a single geographic region during the lockdown at two different times. 1162 tweets related to the COVID-19 pandemic lockdown were analyzed using the keyword hashtags (lockdown, COVID-19); the first sample of tweets was from March 23, 2020, until April 23, 2020, and the second sample, for the following year, was from March 1, 2021, until April 4, 2021. Natural language processing (NLP), which is a form of artificial intelligence, was used in this research to calculate the sentiment value of all of the tweets by using the AFINN lexicon sentiment analysis method. The findings revealed that the sentiment in both periods of the region's lockdown was positive in the samples of this study, which are unique to the specific geographical area of New Zealand. This research suggests applying machine learning sentiment methods such as CrystalFeel and extending the sample size by using more tweets over a longer period of time.
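The AFINN method itself is simple enough to show in a few lines: each known word carries an integer valence between -5 and +5, and a tweet's score is the sum over its words. The sketch below uses a tiny illustrative subset of the lexicon, not the full AFINN list or the study's tweet corpus.

```python
# AFINN-style lexicon scoring. The tiny lexicon below is an illustrative
# excerpt; the real AFINN list contains thousands of scored words.
AFINN_SUBSET = {"good": 3, "great": 3, "happy": 3, "safe": 1,
                "bad": -3, "sad": -2, "scared": -2, "crisis": -3}

def afinn_score(tweet: str) -> int:
    tokens = tweet.lower().split()
    return sum(AFINN_SUBSET.get(tok.strip(".,!?#"), 0) for tok in tokens)

tweets = ["Staying home and feeling safe during #lockdown",
          "This crisis is making everyone sad and scared"]
for t in tweets:
    print(afinn_score(t), t)   # positive vs. negative overall sentiment
```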

Keywords: sentiment analysis, Twitter analysis, lockdown, Covid-19, AFINN, NodeJS

Procedia PDF Downloads 158
3242 Performance of Hybrid Image Fusion: Implementation of Dual-Tree Complex Wavelet Transform Technique

Authors: Manoj Gupta, Nirmendra Singh Bhadauria

Abstract:

Most applications in image processing require high spatial and high spectral resolution in a single image. For example, satellite imaging systems, traffic monitoring systems, and long-range sensor fusion systems all use image processing. However, most of the available equipment is not capable of providing this type of data. The sensor in a surveillance system can only cover the view of a small area for a particular focus, yet the demanding applications of such a system require a view with high coverage of the field. Image fusion provides the possibility of combining different sources of information. In this paper, we have decomposed the images using the DT-CWT and then fused them using average and hybrid (maxima and average) pixel-level techniques, and then compared the quality of the fused images using PSNR.
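A minimal sketch of the fusion rules and the PSNR metric is given below, applied directly to pixel arrays for brevity; the DT-CWT decomposition/reconstruction step is omitted, and the "hybrid" rule shown is one plausible reading of combining the maxima and average rules.

```python
# Pixel-level fusion rules and PSNR comparison (plain NumPy sketch; the
# DT-CWT subband processing used in the paper is not reproduced here).
import numpy as np

def fuse_average(img_a, img_b):
    return (img_a.astype(float) + img_b.astype(float)) / 2.0

def fuse_hybrid(img_a, img_b):
    # One possible hybrid rule: blend the larger-magnitude selection with the average.
    maxima = np.where(np.abs(img_a) >= np.abs(img_b), img_a, img_b)
    return (maxima.astype(float) + fuse_average(img_a, img_b)) / 2.0

def psnr(reference, test, peak=255.0):
    mse = np.mean((reference.astype(float) - test.astype(float)) ** 2)
    return np.inf if mse == 0 else 10.0 * np.log10(peak ** 2 / mse)

a = np.random.default_rng(1).integers(0, 256, (64, 64))
b = np.random.default_rng(2).integers(0, 256, (64, 64))
print(psnr(a, fuse_average(a, b)), psnr(a, fuse_hybrid(a, b)))
```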

Keywords: image fusion, DWT, DT-CWT, PSNR, average image fusion, hybrid image fusion

Procedia PDF Downloads 574
3241 An Experimental Study on the Variability of Nonnative and Native Inference of Word Meanings in Timed and Untimed Conditions

Authors: Swathi M. Vanniarajan

Abstract:

Reading research suggests that online contextual vocabulary comprehension while reading is an interactive and integrative process. One’s success in it depends on a variety of factors including the amount and the nature of available linguistic and nonlinguistic cues, one’s analytical and integrative skills, schema memory (content familiarity), and processing speed characterized along the continuum of controlled to automatic processing. The experiment reported here, conducted with 30 native speakers as one group and 30 nonnative speakers as another group (all graduate students), hypothesized that, while working on 24 tasks which required them to comprehend an unfamiliar word in real time without backtracking, the nonnative subjects, due to the differences in the nature of their respective reading processes, would be less able than the native subjects to construct the meanings of the unknown words by integrating the multiple but sufficient contextual cues provided in the text. The results indicated that there were significant inter-group as well as intra-group differences in terms of the quality of the definitions given. However, when given additional time, the nonnative speakers could significantly improve the quality of their definitions while the native speakers in general did not, suggesting that, all things being equal, time is a significant factor for success in nonnative vocabulary and reading comprehension processes and that accuracy precedes automaticity in the development of nonnative reading processes as well.

Keywords: reading, second language processing, vocabulary comprehension

Procedia PDF Downloads 142
3240 Perceiving Text-Worlds as a Cognitive Mechanism to Understand Surah Al-Kahf

Authors: Awatef Boubakri, Khaled Jebahi

Abstract:

Using Text World Theory (TWT), we attempted to understand how mental representations (text worlds) and perceptions can be construed by readers of Quranic texts. To this end, Surah Al-Kahf was purposefully selected given the fact that while each of its stories is narrated, different levels of discourse intervene, which might result in a confused reader who might find it hard to keep track of which discourse he or she is processing. This surah was studied using specifically-designed text-world diagrams. The findings suggest that TWT can be used to help solve problems of ambiguity at the level of discourse in Quranic texts and to help construct a thinking reader whose cognitive constructs (text worlds / mental representations) are built through reflecting on the various and often changing components of discourse world, text world, and sub-worlds.

Keywords: Al-Kahf, Surah, cognitive, processing, discourse

Procedia PDF Downloads 57
3239 Computational Aided Approach for Strut and Tie Model for Non-Flexural Elements

Authors: Mihaja Razafimbelo, Guillaume Herve-Secourgeon, Fabrice Gatuingt, Marina Bottoni, Tulio Honorio-De-Faria

Abstract:

The challenge of this research is to provide engineers with a robust, semi-automatic method for calculating optimal reinforcement for massive structural elements. In the absence of such a digital post-processing tool, design office engineers make intensive use of plate modelling, for which automatic post-processing is available. Plate models in massive areas, on the other hand, produce conservative results. In addition, the theoretical foundations of automatic post-processing tools for reinforcement are those of reinforced concrete beam sections. As long as there is no suitable alternative for automatic post-processing of plates, optimal modelling and a significant improvement of the constructability of massive areas cannot be expected. The strut-and-tie method is commonly used in civil engineering, but its result remains highly dependent on the judgement of the calculating engineer. The tool developed will support engineers in their choice of structure. The method implemented consists of defining a ground structure built on the basis of the principal stresses resulting from an elastic analysis of the structure, and then optimizing this structure according to the fully stressed design method. The first results yield a coherent initial network of connecting struts and ties, consistent with the cases encountered in the literature. The evolution of the tool will then make it possible to adapt the obtained latticework to the cracking states resulting from the loads applied during the life of the structure, including cyclic or dynamic loads. In addition, to satisfy the constructability constraint, a final reinforcement layout with an orthogonal arrangement and regulated spacing will be implemented in the tool.
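The fully stressed design step mentioned above can be sketched as a simple area-update iteration: each member area is rescaled by the ratio of its working stress to the allowable stress, so lightly loaded members of the ground structure shrink away. The member forces and allowable stress below are illustrative constants; in the actual tool they would come from re-analysis of the elastic model at each iteration.

```python
# Fully-stressed-design update on a small ground structure (illustrative data).
import numpy as np

allowable_stress = 235.0                      # MPa (illustrative)
forces = np.array([120.0, -80.0, 5.0, 60.0])  # kN, from the elastic analysis
areas = np.full(4, 10.0)                      # cm^2, initial ground-structure areas

for _ in range(20):
    stresses = np.abs(forces) * 10.0 / areas                 # kN/cm^2 -> MPa
    areas = np.maximum(areas * stresses / allowable_stress,  # rescale toward full stress
                       1e-3)                                  # keep areas strictly positive

print(np.round(areas, 3))   # members with negligible force shrink toward zero
```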

Keywords: strut and tie, optimization, reinforcement, massive structure

Procedia PDF Downloads 120
3238 Automated End-to-End Pipeline Processing Solution for Autonomous Driving

Authors: Ashish Kumar, Munesh Raghuraj Varma, Nisarg Joshi, Gujjula Vishwa Teja, Srikanth Sambi, Arpit Awasthi

Abstract:

Autonomous driving vehicles are revolutionizing the transportation system of the 21st century. This has been possible due to intensive research put into making a robust, reliable, and intelligent program that can perceive and understand its environment and make decisions based on that understanding. It is a very data-intensive task, with data coming from multiple sensors, and the amount of data directly affects the performance of the system. Researchers have to design the preprocessing pipeline for different datasets with different sensor orientations and alignments before the dataset can be fed to the model. This paper proposes a solution that unifies all the data from different sources into a uniform format using the intrinsic and extrinsic parameters of the sensors used to capture the data, allowing the same pipeline to use data from multiple sources at a time. This also means easy adoption of new datasets or in-house generated datasets. The solution also automates the complete deep learning pipeline from preprocessing to post-processing for various tasks, allowing researchers to design multiple custom end-to-end pipelines. Thus, the solution takes care of the input and output data handling, saving the time and effort spent on it and allowing more time for model improvement.
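As an illustration of the unification step based on intrinsic and extrinsic parameters, the sketch below projects a lidar point into camera pixel coordinates; all matrices and numbers are illustrative, not taken from any particular dataset supported by the solution.

```python
# Lidar-to-camera projection using extrinsics (R, t) and intrinsics K.
# All values are illustrative placeholders.
import numpy as np

K = np.array([[1000.0, 0.0, 640.0],      # intrinsics: focal lengths and principal point
              [0.0, 1000.0, 360.0],
              [0.0, 0.0, 1.0]])
R = np.eye(3)                            # extrinsics: lidar -> camera rotation
t = np.array([0.0, -0.1, 0.2])           # extrinsics: lidar -> camera translation (m)

def lidar_to_pixel(point_lidar):
    p_cam = R @ point_lidar + t          # rigid transform into the camera frame
    if p_cam[2] <= 0:
        return None                      # behind the camera, not visible
    uv = K @ p_cam
    return uv[:2] / uv[2]                # perspective division -> pixel (u, v)

print(lidar_to_pixel(np.array([2.0, 0.5, 10.0])))
```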

Keywords: augmentation, autonomous driving, camera, custom end-to-end pipeline, data unification, lidar, post-processing, preprocessing

Procedia PDF Downloads 79
3237 A Pull-Out Fiber/Matrix Interface Characterization of Vegetal Fibers Reinforced Thermoplastic Polymer Composites, the Influence of the Processing Temperature

Authors: Duy Cuong Nguyen, Ali Makke, Guillaume Montay

Abstract:

This work presents an improved single-fiber pull-out test for fiber/matrix interface characterization. The test has been used to study the interfacial shear strength (IFSS) of hemp fiber reinforced polypropylene (PP). For this aim, the fiber diameter has been carefully measured using a tomography-inspired method. The fiber section contour can then be approximated by a circle or a polygon. The results show that the IFSS is overestimated if the circular approximation is used. The influence of the molding temperature on the IFSS has also been studied. We find that a molding temperature of 183 °C leads to better interface properties; above or below this temperature the interface strength is reduced.
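The reason the circular approximation overestimates the IFSS follows directly from the definition IFSS = F_max / (perimeter × embedded length): for an irregular section, the circle inferred from the apparent diameter has a shorter perimeter than the true contour. The sketch below illustrates this with made-up numbers, not measurements from the paper.

```python
# IFSS from a pull-out test: circular vs. polygonal perimeter (illustrative data).
import numpy as np

f_max = 0.15            # N, peak pull-out force
embedded_len = 0.2e-3   # m, embedded length

# Irregular, lobed fiber contour standing in for a tomography-style measurement (m)
theta = np.linspace(0, 2 * np.pi, 200, endpoint=False)
radius = 30e-6 * (1 + 0.25 * np.cos(5 * theta))
contour = np.column_stack([radius * np.cos(theta), radius * np.sin(theta)])

segments = np.diff(contour, axis=0, append=contour[:1])           # close the polygon
poly_perimeter = np.sum(np.linalg.norm(segments, axis=1))
circ_perimeter = np.pi * (contour[:, 0].max() - contour[:, 0].min())  # from apparent diameter

for name, perim in [("polygon", poly_perimeter), ("circle", circ_perimeter)]:
    print(name, f_max / (perim * embedded_len) / 1e6, "MPa")       # circle gives the larger IFSS
```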

Keywords: composite, hemp, interface, pull-out, processing, polypropylene, temperature

Procedia PDF Downloads 365
3236 Quantification of Peptides (linusorbs) in Gluten-free Flaxseed Fortified Bakery Products

Authors: Youn Young Shim, Ji Hye Kim, Jae Youl Cho, Martin JT Reaney

Abstract:

Flaxseed (Linum usitatissimum L.) is gaining popularity in the food industry as a superfood due to its health-promoting properties. Linusorbs (LOs, a.k.a. cyclolinopeptides) are bioactive compounds present in flaxseed exhibiting potential health effects. The study focused on the effects of processing and storage on the stability of flaxseed-derived LOs added to various bakery products. Flaxseed meal fortified gluten-free (GF) bakery bread was prepared, and the changes in LOs during the bread-making process (meal, fortified flour, dough, and bread) and storage (0, 1, 2, and 4 weeks) at different temperatures (−18 °C, 4 °C, and 22−23 °C) were analyzed by high-performance liquid chromatography-diode array detection. The total oxidative LOs and LO1OB2 remained essentially stable in flaxseed meal at storage temperatures of 22−23 °C, −18 °C, and 4 °C for up to four weeks. Processing steps during GF bread production resulted in the oxidation of LOs. Interestingly, no LOs were detected in the dough sample; however, LOs appeared when the dough was stored at −18 °C for one week, suggesting that freezing destroyed the sticky structure of the dough and resulted in the release of LOs. The final product, flaxseed meal fortified bread, could be stored for up to four weeks at −18 °C and 4 °C, and for one week at 22−23 °C. All these results suggest that LOs may change during processing and storage and that flaxseed flour-fortified bread should be stored at low temperatures to preserve the effective LO components.

Keywords: linum usitatissimum L., flaxseed, linusorb, stability, gluten-free, peptides, cyclolinopeptide

Procedia PDF Downloads 155
3235 Use of Satellite Imaging to Understand Earth’s Surface Features: A Roadmap

Authors: Sabri Serkan Gulluoglu

Abstract:

With Geographic Information Systems (GIS), information about all natural and artificial resources on the Earth can be obtained by taking advantage of satellite images acquired through remote sensing techniques. However, determination of unknown resources, mapping of their distribution, and efficient evaluation of resources may not be possible with the original image. For these reasons, processing steps such as transformation, pre-processing, image enhancement, and classification are needed to provide the most accurate assessment numerically and visually. Many studies which present the phases of obtaining and processing satellite images have been examined in the literature review. The review showed that establishing a common set of process steps for this subject can help the necessary future studies progress rapidly.

Keywords: remote sensing, satellite imaging, GIS, computer science, information

Procedia PDF Downloads 294
3234 Trabecular Texture Analysis Using Fractal Metrics for Bone Fragility Assessment

Authors: Khaled Harrar, Rachid Jennane

Abstract:

The purpose of this study is the discrimination of 28 postmenopausal women with osteoporotic femoral fractures from an age-matched control group of 28 women using texture analysis based on fractals. Two pre-processing approaches are applied to the radiographic images; these techniques are compared to motivate the choice of the pre-processing method. Furthermore, the values of the fractal dimension are compared to those of the fractal signature in terms of the classification of the two populations. In a second analysis, the bone mineral density (BMD) measured at the proximal femur was compared to the fractal analysis; the latter, which is a non-invasive technique, allowed a better discrimination. The results confirm that fractal analysis of texture on calcaneus radiographs is able to discriminate osteoporotic patients with femoral fracture from controls. This discrimination was more efficient than that obtained by BMD alone, and it was also present when comparing subgroups with overlapping values of BMD.
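For readers unfamiliar with fractal texture metrics, the sketch below shows a basic box-counting estimate of the fractal dimension of a binarized texture patch; the synthetic image stands in for a calcaneus radiograph region of interest, and the authors' actual estimators (fractal dimension and fractal signature) may differ in detail.

```python
# Box-counting fractal dimension of a binary texture: count occupied boxes at
# several scales and take the slope of log(count) vs. log(1/size).
import numpy as np

rng = np.random.default_rng(0)
texture = (rng.random((256, 256)) > 0.6)        # placeholder binary texture

def box_count(img, size):
    h, w = img.shape
    trimmed = img[:h - h % size, :w - w % size]
    blocks = trimmed.reshape(h // size, size, w // size, size)
    return np.count_nonzero(blocks.any(axis=(1, 3)))

sizes = np.array([2, 4, 8, 16, 32])
counts = np.array([box_count(texture, s) for s in sizes])
slope, _ = np.polyfit(np.log(1.0 / sizes), np.log(counts), 1)
print("estimated fractal dimension:", round(slope, 3))
```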

Keywords: osteoporosis, fractal dimension, fractal signature, bone mineral density

Procedia PDF Downloads 396
3233 The Application of the Taguchi Method to Optimize Pellet Quality in Broiler Feeds

Authors: Reza Vakili

Abstract:

The aim of this experiment was to optimize the effects of moisture, production rate, grain particle size, and steam conditioning temperature on pellet quality in broiler feed using the Taguchi method, and a 43 fractional factorial arrangement was conducted. Different production rates, steam conditioning temperatures, particle sizes, and moisture contents were tested. During the production process, sampling was done, and then the pellet durability index (PDI) and hardness were evaluated in broiler grower and finisher feeds. There was a significant effect of the processing parameters on PDI and hardness. Based on the results of this experiment, the Taguchi method can be used to find the best combination of factors for optimal pellet quality.
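As a reminder of how Taguchi analysis ranks factor combinations, the sketch below computes the larger-the-better signal-to-noise ratio for a set of replicate PDI values; the numbers are illustrative, not data from this experiment.

```python
# Larger-the-better Taguchi signal-to-noise ratio applied to replicate PDI
# measurements of one factor combination (illustrative values).
import numpy as np

pdi_replicates = np.array([92.1, 93.4, 91.8])          # % durability, one trial run

def sn_larger_is_better(y):
    y = np.asarray(y, dtype=float)
    return -10.0 * np.log10(np.mean(1.0 / y ** 2))

print(round(sn_larger_is_better(pdi_replicates), 2), "dB")
# The factor levels whose average S/N across trials is highest are selected
# as the optimal processing combination.
```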

Keywords: broiler, feed physical quality, hardness, processing parameters, PDI

Procedia PDF Downloads 145
3232 Portable and Parallel Accelerated Development Method for Field-Programmable Gate Array (FPGA)-Central Processing Unit (CPU)- Graphics Processing Unit (GPU) Heterogeneous Computing

Authors: Nan Hu, Chao Wang, Xi Li, Xuehai Zhou

Abstract:

The field-programmable gate array (FPGA) has been widely adopted in the high-performance computing domain. In recent years, embedded system-on-a-chip (SoC) devices contain coarse-granularity multi-core CPUs (central processing units) and mobile GPUs (graphics processing units) that can be used as general-purpose accelerators. The motivation is that algorithms with various parallel characteristics can be efficiently mapped to the heterogeneous architecture formed by these three processors. The CPU and GPU offload partial computationally intensive tasks from the FPGA to reduce the resource consumption and lower the overall cost of the system. However, in present common scenarios, applications utilize only one type of accelerator because the development approaches supporting the collaboration of the heterogeneous processors face challenges. Therefore, a systematic approach is needed that takes advantage of write-once-run-anywhere portability and high execution performance of the modules mapped to the various architectures, and that facilitates the exploration of the design space. In this paper, a servant-execution-flow model is proposed for the abstraction of the cooperation of the heterogeneous processors, which supports task partition, communication, and synchronization. At its first run, the intermediate language represented by the data flow diagram can generate the executable code of the target processor or can be converted into high-level programming languages. The instantiation parameters efficiently control the relationship between the modules and computational units, including the mapping of two hierarchical processing units and the adjustment of data-level parallelism. An embedded system for a three-dimensional waveform oscilloscope is selected as a case study. The performance of algorithms such as contrast stretching is analyzed with implementations on various combinations of these processors. The experimental results show that the heterogeneous computing system achieves performance similar to the pure FPGA implementation with approximately the same energy efficiency while using less than 35% of the resources.
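Contrast stretching, cited as a case-study algorithm, is a purely data-parallel per-pixel map, which is what makes it easy to assign to the FPGA, GPU, or CPU interchangeably. A minimal sketch is given below; it is independent of the proposed servant-execution-flow model.

```python
# Per-pixel contrast stretching: rescale the intensity range of a frame to
# [out_min, out_max]. Each pixel is independent, so the map parallelizes freely.
import numpy as np

def contrast_stretch(img, out_min=0, out_max=255):
    img = img.astype(float)
    lo, hi = img.min(), img.max()
    if hi == lo:
        return np.full_like(img, out_min)
    return (img - lo) / (hi - lo) * (out_max - out_min) + out_min

frame = np.random.default_rng(0).integers(60, 180, (480, 640))   # low-contrast frame
stretched = contrast_stretch(frame)
print(stretched.min(), stretched.max())                           # 0.0 255.0
```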

Keywords: FPGA-CPU-GPU collaboration, design space exploration, heterogeneous computing, intermediate language, parameterized instantiation

Procedia PDF Downloads 83
3231 Modification of Polymer Composite Based on Electromagnetic Radiation

Authors: Ananta R. Adhikari

Abstract:

In today's era, polymer composite utilization has witnessed a significant increase across various fronts of material science advancement. Despite the development of many highly sophisticated technologies aimed at modifying polymer composites, there persists a quest for a technology that is straightforward, energy-efficient, easily controllable, cost-effective, time-saving, and environmentally friendly. Microwave technology has emerged as a major technique in material synthesis and modification due to its unique characteristics, such as rapid, selective, and uniform heating and, particularly, direct heating based on molecular interaction. This study addresses the utilization of microwave energy as an alternative technique for material processing. Specifically, we will present ongoing research conducted in our laboratory, focusing on its applications in the medical field.

Keywords: polymer composites, material processing, microstructure, microwave radiation

Procedia PDF Downloads 19
3230 Economized Sensor Data Processing with Vehicle Platooning

Authors: Henry Hexmoor, Kailash Yelasani

Abstract:

We present vehicular platooning as a special case of a crowd-sensing framework in which sharing sensory information among a crowd is used for their collective benefit. After offering an abstract policy that governs processes involving a vehicular platoon, we review several common scenarios and components surrounding vehicular platooning. We then present a simulated prototype that illustrates the efficiency of road usage and the vehicle travel time derived from platooning. We argue that one of the paramount benefits of platooning, one that is overlooked elsewhere, is the substantial computational savings (i.e., economizing benefits) in the acquisition and processing of sensory data among vehicles sharing the road. The most capable vehicle can share data gathered from its sensors with nearby vehicles grouped into a platoon.

Keywords: cloud network, collaboration, internet of things, social network

Procedia PDF Downloads 167
3229 Modalmetric Fiber Sensor and Its Applications

Authors: M. Zyczkowski, P. Markowski, M. Karol

Abstract:

The team from IOE MUT has been developing fiber optic sensors for security systems for 15 years. The conclusions of this work indicate that these sensors are complicated. Moreover, these sensors are expensive to produce and require sophisticated signal processing methods. We present the results of investigations of three different applications of the modalmetric sensor: • protection of museum collections and heritage buildings, • protection of fiber optic transmission lines, • protection of objects of critical infrastructure. Each of the presented applications involves different requirements for the system. The results indicate that it is possible to develop a fiber optic sensor based on a single fiber. Modification of the optoelectronic parts, together with a change of the length of the sensor and the manner in which propagating light is reflected at the end of the sensor, allows the system to be adjusted to the specific application.

Keywords: modalmetric fiber optic sensor, security sensor, optoelectronic parts, signal processing

Procedia PDF Downloads 597
3228 Making Lightweight Concrete with Meerschaum

Authors: H. Gonen, M. Dogan

Abstract:

Meerschaum, which is found in the earth’s crust, is a white, clay-like hydrous magnesium silicate. It has a wide area of use, from the production of various ornaments to the chemical industry. It has a white and irregular crystalline structure. It is wet and moist when extracted, which is a good form for processing. During the drying phase, it gradually loses its moisture and becomes lighter and harder. In the through-dry state, meerschaum is durable and floats on water. After processing of meerschaum, between 15% and 40% of the material becomes waste. This waste is usually kept in a dry atmosphere, isolated from environmental effects, so that it can be used right away when needed. In this study, the use of meerschaum waste as aggregate in lightweight concrete is investigated. Stress-strain diagrams for concrete with meerschaum aggregate are obtained. Then, the stress-strain diagrams of the lightweight concrete and of concrete with regular aggregate are compared. It is concluded that meerschaum waste can be used in the production of lightweight concrete.

Keywords: lightweight concrete, meerschaum, aggregate, sepiolite, stress-strain diagram

Procedia PDF Downloads 573
3227 Hydrodynamic Analysis of Fish Fin Kinematics of Oreochromis Niloticus Using Machine Learning and Image Processing

Authors: Paramvir Singh

Abstract:

The locomotion of aquatic organisms has long fascinated biologists and engineers alike, with fish fins serving as a prime example of nature's remarkable adaptations for efficient underwater propulsion. This paper presents a comprehensive study focused on the hydrodynamic analysis of fish fin kinematics, employing an innovative approach that combines machine learning and image processing techniques. Through high-speed videography and advanced computational tools, we gain insights into the complex and dynamic motion of the fins of a tilapia (Oreochromis niloticus). The study began by experimentally capturing videos of the various motions of a tilapia in a custom-made setup. Using deep learning and image processing on the videos, the motion of the caudal and pectoral fins was extracted. This motion included the fin configuration (i.e., the angle of deviation from the mean position) with respect to time. Numerical investigations of the flapping fins are then performed using a computational fluid dynamics (CFD) solver. 3D models of the fins were created, mimicking the real-life geometry of the fins. Thrust characteristics of the fins separately (i.e., caudal and pectoral) and of the fins acting together were studied. The relationship and the phase between caudal and pectoral fin motion are also discussed. The key objectives include mathematical modeling of the motion of a flapping fin at different naturally occurring frequencies and amplitudes. The interactions between the two fins (caudal and pectoral) were also an area of keen interest. This work aims to improve on past research on similar topics. These results can also help in the better and more efficient design of propulsion systems for biomimetic underwater vehicles that are used to study aquatic ecosystems, explore uncharted or challenging underwater regions, perform ocean bed modeling, etc.
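The mathematical modelling step can be illustrated by fitting a sinusoid of known flapping frequency to the extracted fin-angle time series. The sketch below uses synthetic angles in place of the deep-learning tracking output; the frequency, amplitude, and noise level are illustrative.

```python
# Fit angle(t) = a*sin(wt) + b*cos(wt) + c to a fin-angle time series by linear
# least squares, then report amplitude and phase. Synthetic data only.
import numpy as np

freq_hz = 2.0                                   # illustrative flapping frequency
t = np.linspace(0.0, 2.0, 120)                  # 2 s of tracked frames
rng = np.random.default_rng(0)
angle = 18.0 * np.sin(2 * np.pi * freq_hz * t + 0.6) + rng.normal(0, 1.0, t.size)

w = 2 * np.pi * freq_hz
design = np.column_stack([np.sin(w * t), np.cos(w * t), np.ones_like(t)])
(a, b, c), *_ = np.linalg.lstsq(design, angle, rcond=None)

amplitude = np.hypot(a, b)                      # a*sin + b*cos = A*sin(wt + phi)
phase = np.arctan2(b, a)
print(f"amplitude ~ {amplitude:.1f} deg, phase ~ {phase:.2f} rad, offset ~ {c:.2f} deg")
```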

Keywords: biomimetics, fish fin kinematics, image processing, fish tracking, underwater vehicles

Procedia PDF Downloads 48
3226 Using Deep Learning Real-Time Object Detection Convolution Neural Networks for Fast Fruit Recognition in the Tree

Authors: K. Bresilla, L. Manfrini, B. Morandi, A. Boini, G. Perulli, L. C. Grappadelli

Abstract:

Image/video processing for fruit in the tree using hard-coded feature extraction algorithms has shown high accuracy in recent years. While accurate, these approaches, even with high-end hardware, are computationally intensive and too slow for real-time systems. This paper details the use of deep convolutional neural networks (CNNs), specifically an algorithm (YOLO - You Only Look Once) with 24+2 convolution layers. Using deep-learning techniques eliminated the need for hard-coding specific features for specific fruit shapes, colors, and/or other attributes. This CNN was trained on more than 5000 images of apple and pear fruits on a 960-core GPU (graphical processing unit). The test set showed an accuracy of 90%. After this, the trained model was transferred to an embedded device (Raspberry Pi gen. 3) with a camera for more portability. Based on the correlation between the number of visible or detected fruits in one frame and the real number of fruits on one tree, a model was created to accommodate this error rate. The speed of processing and detection of the whole platform was higher than 40 frames per second. This speed is fast enough for any grasping/harvesting robotic arm or other real-time applications.
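The correction model relating detected and actual fruit counts can be sketched as a simple linear fit; the count pairs below are illustrative, not the study's data.

```python
# Linear correction from per-frame detections to the real per-tree count
# (illustrative count pairs, not the paper's measurements).
import numpy as np

detected_per_frame = np.array([34, 41, 28, 52, 47, 39])    # CNN detections
actual_on_tree = np.array([78, 95, 66, 120, 108, 90])       # manual ground truth

slope, intercept = np.polyfit(detected_per_frame, actual_on_tree, 1)

def estimate_tree_count(n_detected):
    return slope * n_detected + intercept

print(round(estimate_tree_count(45)))   # corrected estimate for a new frame
```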

Keywords: artificial intelligence, computer vision, deep learning, fruit recognition, harvesting robot, precision agriculture

Procedia PDF Downloads 388
3225 Fractal-Wavelet Based Techniques for Improving the Artificial Neural Network Models

Authors: Reza Bazargan lari, Mohammad H. Fattahi

Abstract:

Natural resources management, including water resources, requires reliable estimations of time-variant environmental parameters. Small improvements in the estimation of environmental parameters would have great effects on management decisions. Noise reduction using wavelet techniques is an effective approach for pre-processing of practical data sets. The predictability enhancement of river flow time series is assessed using fractal approaches before and after applying wavelet-based pre-processing. Time series correlation and persistency, the minimum sufficient length for training the predicting model, and the maximum valid length of predictions were also investigated through a fractal assessment.
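A minimal sketch of the wavelet-based pre-processing step is shown below: decompose the series, soft-threshold the detail coefficients with a universal threshold, and reconstruct. The synthetic series, wavelet, and thresholding rule are illustrative choices; the paper does not specify its exact settings.

```python
# Wavelet de-noising of a (synthetic) flow series with PyWavelets:
# decompose, soft-threshold the detail coefficients, reconstruct.
import numpy as np
import pywt

rng = np.random.default_rng(0)
t = np.arange(512)
flow = 50 + 20 * np.sin(2 * np.pi * t / 64) + rng.normal(0, 4, t.size)

coeffs = pywt.wavedec(flow, "db4", level=4)
sigma = np.median(np.abs(coeffs[-1])) / 0.6745          # noise estimate from the finest scale
thresh = sigma * np.sqrt(2 * np.log(len(flow)))         # universal threshold
denoised = [coeffs[0]] + [pywt.threshold(c, thresh, mode="soft") for c in coeffs[1:]]
flow_denoised = pywt.waverec(denoised, "db4")[:len(flow)]

print(np.std(flow - flow_denoised))   # roughly the removed noise level
```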

Keywords: wavelet, de-noising, predictability, time series fractal analysis, valid length, ANN

Procedia PDF Downloads 345
3224 GPU-Based Back-Projection of Synthetic Aperture Radar (SAR) Data onto 3D Reference Voxels

Authors: Joshua Buli, David Pietrowski, Samuel Britton

Abstract:

Processing SAR data usually requires constraints in extent in the Fourier domain as well as approximations and interpolations onto a planar surface to form an exploitable image. This results in a potential loss of data, requires several interpolative techniques, and restricts visualization to two-dimensional plane imagery. The data can be interpolated into a ground plane projection, with or without terrain as a component, all to better view SAR data in an image domain comparable to what a human would view, to ease interpretation. An alternate but computationally heavy method to make use of more of the data is the basis of this research. Pre-processing of the SAR data is completed first (matched filtering, motion compensation, etc.), the data is then range compressed, and lastly, the contribution from each pulse is determined for each specific point in space by searching the time history data for the reflectivity values for each pulse, summed over the entire collection. This results in a per-3D-point reflectivity using the entire collection domain. New advances in GPU processing have finally allowed this rapid projection of acquired SAR data onto any desired reference surface (called backprojection). Mathematically, the computations are fast and easy to implement, despite limitations in SAR phase history data size and 3D point cloud size. Backprojection processing algorithms are embarrassingly parallel since each 3D point in the scene has the same reflectivity calculation applied for all pulses, independent of all other 3D points and pulse data under consideration. Therefore, given the simplicity of the single backprojection calculation, the work can be spread across thousands of GPU threads, allowing for an accurate reflectivity representation of a scene. Furthermore, because reflectivity values are associated with individual three-dimensional points, a plane is no longer the sole permissible mapping base; a digital elevation model or even a cloud of points (collected from any sensor capable of measuring ground topography) can be used as a basis for the backprojection technique. This technique minimizes any interpolations and modifications of the raw data, maintaining maximum data integrity. This innovative processing will allow SAR data to be rapidly brought into a common reference frame for immediate exploitation and data fusion with other three-dimensional data and representations.
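The per-point backprojection sum described above reduces to a short loop, sketched below in plain NumPy with illustrative array names; in a GPU implementation the same loop body runs in one thread per voxel. The phase-correction convention and the interpolation are simplifications, not the authors' exact formulation.

```python
# Simplified per-voxel backprojection: for every pulse, look up the
# range-compressed sample at the voxel's range and accumulate it with a
# matched phase correction. Illustrative names; not the authors' code.
import numpy as np

C = 299_792_458.0          # speed of light, m/s
fc = 10e9                  # illustrative center frequency, Hz

def backproject(voxels, platform_positions, range_profiles, range_bins):
    """voxels: (V,3); platform_positions: (P,3); range_profiles: (P,B) complex."""
    image = np.zeros(len(voxels), dtype=complex)
    for pos, profile in zip(platform_positions, range_profiles):
        r = np.linalg.norm(voxels - pos, axis=1)                 # range to each voxel
        sample_re = np.interp(r, range_bins, profile.real)       # pull the compressed
        sample_im = np.interp(r, range_bins, profile.imag)       # sample at that range
        phase = np.exp(1j * 4 * np.pi * fc * r / C)              # matched phase correction
        image += (sample_re + 1j * sample_im) * phase
    return np.abs(image)                                          # per-voxel reflectivity
```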

Keywords: backprojection, data fusion, exploitation, three-dimensional, visualization

Procedia PDF Downloads 44
3227 Contribution to the Evaluation of Uncertainties of Measurement to the Data Processing Sequences of a CMM

Authors: Hassina Gheribi, Salim Boukebbab

Abstract:

The measurement of parts manufactured on a CMM (coordinate measuring machine) is based on the association of a surface of perfect geometry to the group of probed points via a mathematical calculation of the distances between the probed points and that surface. Since surfaces are never perfect, they are measured with a number of points higher than the minimal number necessary to define them mathematically. However, the central problems of three-dimensional metrology are the estimation of the orientation, location, and intrinsic parameters of this surface. Including the numerical uncertainties attached to these parameters helps the metrologist make decisions in order to declare the conformity of the part to the specifications fixed on the design drawing. In this paper, we present a data-processing model in Visual Basic 6 which makes it possible to automatically determine all of these parameters and their uncertainties.
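The association of a perfect-geometry surface to the probed points is, for a planar feature, a least-squares fit. The sketch below fits a plane by SVD and uses the residual point-to-plane distances as the raw material for the uncertainty estimates; the point cloud is simulated, not CMM data, and the paper's Visual Basic 6 model may proceed differently.

```python
# Least-squares plane association: SVD of the centered point cloud gives the
# plane normal; residuals along the normal quantify the form deviation.
import numpy as np

rng = np.random.default_rng(0)
xy = rng.uniform(-20, 20, (60, 2))                       # mm, probing pattern
z = 0.02 * xy[:, 0] - 0.01 * xy[:, 1] + rng.normal(0, 0.003, 60)
points = np.column_stack([xy, z])

centroid = points.mean(axis=0)
_, _, vt = np.linalg.svd(points - centroid)
normal = vt[-1]                                          # direction of least variance
residuals = (points - centroid) @ normal                 # signed point-to-plane distances

print("normal:", np.round(normal, 4))
print("form deviation (std of residuals, mm):", round(residuals.std(ddof=1), 4))
```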

Keywords: coordinate measuring machines (CMM), associated surface, uncertainties of measurement, acquisition and modeling

Procedia PDF Downloads 294
3222 Mapping of Alteration Zones in Mineral Rich Belt of South-East Rajasthan Using Remote Sensing Techniques

Authors: Mrinmoy Dhara, Vivek K. Sengar, Shovan L. Chattoraj, Soumiya Bhattacharjee

Abstract:

Remote sensing techniques have emerged as an asset for various geological studies. Satellite images obtained by different sensors contain plenty of information related to the terrain. Digital image processing further helps in customized ways for the prospecting of minerals. In this study, an attempt has been made to map the hydrothermally altered zones using multispectral and hyperspectral datasets of South-East Rajasthan. Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER) and Hyperion (Level 1R) datasets have been processed to generate different band ratio composites (BRCs). For this study, ASTER-derived BRCs were generated to delineate the alteration zones, gossans, abundant clays, and host rocks. ASTER and Hyperion images were further processed to extract mineral end members, and classified mineral maps have been produced using the Spectral Angle Mapper (SAM) method. The results were validated with the geological map of the area, which shows positive agreement with the image processing outputs. Thus, this study concludes that band ratios and image processing in combination play a significant role in the demarcation of alteration zones, which may provide pathfinders for mineral prospecting studies.
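The Spectral Angle Mapper rule used for the classified mineral maps assigns each pixel to the end member with which its spectrum forms the smallest angle. A minimal sketch follows; the four-band spectra are purely illustrative and far shorter than real ASTER or Hyperion spectra.

```python
# Spectral Angle Mapper: classify a pixel by the smallest spectral angle to a
# set of end-member spectra (illustrative four-band reflectance values).
import numpy as np

def spectral_angle(pixel, reference):
    cos = np.dot(pixel, reference) / (np.linalg.norm(pixel) * np.linalg.norm(reference))
    return np.arccos(np.clip(cos, -1.0, 1.0))            # radians

end_members = {"clay_alteration": np.array([0.32, 0.18, 0.45, 0.27]),
               "host_rock":       np.array([0.22, 0.21, 0.20, 0.19])}

pixel = np.array([0.30, 0.19, 0.41, 0.25])
angles = {name: spectral_angle(pixel, ref) for name, ref in end_members.items()}
print(min(angles, key=angles.get), angles)                # best-matching end member
```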

Keywords: ASTER, hyperion, band ratios, alteration zones, SAM

Procedia PDF Downloads 256