Search results for: algebraic signal processing
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 5162

4472 A Cooperative Signaling Scheme for Global Navigation Satellite Systems

Authors: Keunhong Chae, Seokho Yoon

Abstract:

Recently, global navigation satellite systems (GNSSs) such as Galileo and GPS have been employing more satellites to provide higher location accuracy, calling for a more efficient signaling scheme among the satellites in the overall GNSS network. Spatial diversity improves the network throughput and is thus an attractive signaling scheme; however, it requires multiple antennas, which could significantly increase the complexity of the GNSS. Thus, a diversity scheme called cooperative signaling was proposed, in which virtual multiple-input multiple-output (MIMO) signaling is realized using only a single antenna at the transmit satellite of interest, with the neighboring satellites modeled as relay nodes. The main drawback of cooperative signaling is that the relay nodes receive the transmitted signal at different time instants, i.e., they operate asynchronously, which can severely degrade the overall performance of the GNSS network. To tackle this problem, several modified cooperative signaling schemes were proposed; however, all of them are difficult to implement because of the signal decoding required at the relay nodes. Although the relay nodes could be simplified to some degree by employing time-reversal and conjugation operations instead of signal decoding, it would be more efficient to move the relay-node operations to the source node, which has more resources than the relay nodes. In this paper, we therefore propose a novel cooperative signaling scheme in which the data signals are combined in a unique way at the source node, obviating the need for complex operations such as signal decoding, time-reversal, and conjugation at the relay nodes.
The numerical results confirm that the proposed scheme provides the same cooperative diversity and bit error rate (BER) performance as the conventional scheme while significantly reducing the complexity at the relay nodes. Acknowledgment: This work was supported by the National GNSS Research Center program of Defense Acquisition Program Administration and Agency for Defense Development.
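The BER benefit of cooperative (virtual MIMO) diversity can be illustrated with a generic Monte Carlo sketch, not the authors' scheme: BPSK over Rayleigh fading, comparing a single direct link against maximal-ratio combining of the source branch and one cooperating relay branch. All parameters (10 dB per-branch SNR, symbol count) are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200_000                          # number of BPSK symbols
snr = 10 ** (10.0 / 10)              # 10 dB per-branch SNR (assumed)

bits = rng.integers(0, 2, n)
sym = 2.0 * bits - 1.0               # BPSK mapping {0,1} -> {-1,+1}

def fade(size):
    # unit-power complex Rayleigh fading coefficients
    return (rng.standard_normal(size) + 1j * rng.standard_normal(size)) / np.sqrt(2)

def awgn(size):
    # complex Gaussian noise scaled for the chosen per-branch SNR
    return (rng.standard_normal(size) + 1j * rng.standard_normal(size)) / np.sqrt(2 * snr)

# direct link only: no diversity
h0 = fade(n)
r0 = h0 * sym + awgn(n)
ber_single = np.mean(np.sign(np.real(np.conj(h0) * r0)) != sym)

# source + one cooperating relay: maximal-ratio combining of the two branches
h1, h2 = fade(n), fade(n)
r1 = h1 * sym + awgn(n)
r2 = h2 * sym + awgn(n)
metric = np.real(np.conj(h1) * r1 + np.conj(h2) * r2)
ber_coop = np.mean(np.sign(metric) != sym)
```

With the extra branch, the combined decision statistic rarely fades on both links at once, so `ber_coop` comes out well below `ber_single`.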

Keywords: global navigation satellite network, cooperative signaling, data combining, nodes

Procedia PDF Downloads 281
4471 Yoghurt Kepel Stelechocarpus burahol as an Effort of Functional Food Diversification from Region of Yogyakarta

Authors: Dian Nur Amalia, Rifqi Dhiemas Aji, Tri Septa Wahyuningsih, Endang Wahyuni

Abstract:

Kepel fruit (Stelechocarpus burahol) is a scarce fruit that serves as an emblem of Daerah Istimewa Yogyakarta. It can be used in beauty-treatment products such as deodorant, is good for skin health, and contains antioxidant compounds. Nevertheless, the fruit is rarely cultivated, both because of its image as a palace fruit and because its flesh fraction is small, giving it low economic value: the flesh makes up only about 49% of the whole fruit. This small yield is precisely why kepel fruit should be extracted and processed together with other products. Yoghurt is a processed milk product that also serves as a functional food; economically, its price is higher than that of whole milk or other processed milk products. Yoghurt is usually flavored or colored with plant extracts or chemical additives. Kepel fruit can play the role of a flavoring in yoghurt; besides being a product that is good for digestion, yoghurt with kepel can also function as a "beauty" food. The writing method used is a literature study of the potential of kepel fruit as a local fruit of Yogyakarta and of yoghurt as a processed milk product. The process is just like making common yoghurt, because the kepel fruit acts only as a flavoring substance and does not affect the other steps of yoghurt processing. Food diversification can increase the value of local resources so that they can compete in the ASEAN Economic Community (AEC); producing kepel yoghurt is one such effort.

Keywords: kepel, yoghurt, Daerah Istimewa Yogyakarta, functional food

Procedia PDF Downloads 320
4470 The Analogue of a Property of Pisot Numbers in Fields of Formal Power Series

Authors: Wiem Gadri

Abstract:

This study delves into the intriguing properties of Pisot and Salem numbers within the framework of formal Laurent series over finite fields, a domain where these numbers' spectral characteristics, Λm(β) and lm(β), have yet to be fully explored. Utilizing a methodological approach that combines algebraic number theory with the analysis of power series, we extend the foundational work of Erdős, Joó, and Komornik to this new setting. Our research uncovers bounds for lm(β), revealing how these depend on the degree of the minimal polynomial of β and thus offering a novel characterization of Pisot and Salem formal power series. The findings significantly contribute to our understanding of these numbers, highlighting their distribution and properties in the context of formal power series. This investigation not only bridges number theory with formal power series analysis but also sets the stage for further interdisciplinary research in these areas.
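For orientation, the classical real-number definitions behind the quantities mentioned in the abstract can be recalled; the paper transposes these to formal Laurent series over a finite field, so this is background rather than the paper's own statement.

```latex
% A real algebraic integer \beta > 1 is a Pisot number if all of its Galois
% conjugates \beta' \neq \beta satisfy |\beta'| < 1, and a Salem number if
% |\beta'| \le 1 with at least one conjugate on the unit circle.
% The Erdos--Joo--Komornik quantity studied for such \beta is
l_m(\beta) \;=\; \inf\Bigl\{\, |y| \;:\;
    y = \sum_{i=0}^{n} \varepsilon_i \beta^{\,i},\;
    \varepsilon_i \in \{-m,\dots,m\},\; y \neq 0 \Bigr\},
% with \Lambda_m(\beta) defined analogously over restricted digit sets.
```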

Keywords: Pisot numbers, Salem numbers, formal power series over a finite field

Procedia PDF Downloads 51
4469 A Comprehensive Analysis of the Phylogenetic Signal in Ramp Sequences in 211 Vertebrates

Authors: Lauren M. McKinnon, Justin B. Miller, Michael F. Whiting, John S. K. Kauwe, Perry G. Ridge

Abstract:

Background: Ramp sequences increase translational speed and accuracy when rare, slowly-translated codons are found at the beginnings of genes. Here, the results of the first analysis of ramp sequences in a phylogenetic construct are presented. Methods: Ramp sequences were compared across 211 vertebrates (110 mammalian and 101 non-mammalian). The presence and absence of ramp sequences were analyzed as a binary character in a parsimony and maximum likelihood framework. Additionally, ramp sequences were mapped to the Open Tree of Life taxonomy to determine the number of parallelisms and reversals that occurred, and these results were compared to what would be expected by random chance. Lastly, aligned nucleotides in ramp sequences were compared to the rest of the sequence in order to examine possible differences in phylogenetic signal between these regions of the gene. Results: Parsimony and maximum likelihood analyses of the presence/absence of ramp sequences recovered phylogenies that are highly congruent with established phylogenies. Additionally, the retention index of ramp sequences is significantly higher than would be expected by random chance (p-value ≈ 0). A chi-square analysis of completely orthologous ramp sequences likewise yielded a p-value of approximately zero compared to random chance. Discussion: Ramp sequences recover phylogenies comparable to those from other phylogenomic methods. Although not all ramp sequences appear to carry a phylogenetic signal, more ramp sequences track speciation than expected by random chance. Therefore, ramp sequences may be used in conjunction with other phylogenomic approaches.
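The parsimony statistics used here can be sketched for a single binary presence/absence character: Fitch's small-parsimony algorithm counts the steps s on a given tree, and the retention index is RI = (g - s)/(g - m), where m and g are the minimum and maximum possible step counts. The toy tree and character states below are hypothetical illustrations, not the study's data.

```python
def fitch(tree, states):
    """Fitch small parsimony: return (state set, step count) for a nested-tuple tree."""
    if isinstance(tree, str):                        # leaf: taxon name
        return {states[tree]}, 0
    left_set, left_steps = fitch(tree[0], states)
    right_set, right_steps = fitch(tree[1], states)
    if left_set & right_set:                         # states agree: no extra step
        return left_set & right_set, left_steps + right_steps
    return left_set | right_set, left_steps + right_steps + 1

def retention_index(tree, states):
    """RI = (g - s) / (g - m) for a binary presence/absence character."""
    s = fitch(tree, states)[1]                       # observed steps on this tree
    vals = list(states.values())
    m = 1 if len(set(vals)) > 1 else 0               # minimum steps on any tree
    g = min(vals.count(0), vals.count(1))            # maximum steps on any tree
    return (g - s) / (g - m) if g != m else 1.0

tree = ((("A", "B"), ("C", "D")), (("E", "F"), ("G", "H")))
# a character that tracks the two clades perfectly (no homoplasy) ...
clean = {"A": 1, "B": 1, "C": 1, "D": 1, "E": 0, "F": 0, "G": 0, "H": 0}
# ... versus one scattered independently of the tree
noisy = {"A": 1, "B": 0, "C": 1, "D": 0, "E": 1, "F": 0, "G": 1, "H": 0}
```

A character that tracks speciation gives RI = 1; a randomly scattered one gives an RI near 0, which is the contrast the study's significance test rests on.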

Keywords: codon usage bias, phylogenetics, phylogenomics, ramp sequence

Procedia PDF Downloads 163
4468 Poincare Plot for Heart Rate Variability

Authors: Mazhar B. Tayel, Eslam I. AlSaba

Abstract:

The heart is among the most important organs in the body; it affects, and is affected by, virtually every factor in the body, and is therefore a good detector of its state. Because the heart signal is non-stationary, its variability should be studied. Heart rate variability (HRV) has thus attracted considerable attention in psychology and medicine and has become an important dependent measure in psychophysiology and behavioral medicine. Quantification and interpretation of heart rate variability, however, remain complex issues fraught with pitfalls. This paper presents one of the non-linear techniques for analyzing HRV. It discusses what the Poincaré plot is, how it works, its benefits, especially for HRV analysis, the limitation of the Poincaré plot caused by its standard descriptors SD1 and SD2, and how to overcome this limitation by using the complex correlation measure (CCM). The CCM is more sensitive to changes in the temporal structure of the Poincaré plot than SD1 and SD2.
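The standard Poincaré descriptors can be computed directly from an RR-interval series: SD1 measures short-term variability (spread perpendicular to the identity line) and SD2 long-term variability (spread along it). The sketch below uses the common variance-based formulas on synthetic RR data; the RR series is an illustrative assumption, not recorded data.

```python
import numpy as np

def poincare_sd(rr):
    """SD1/SD2 descriptors of the Poincaré plot of an RR-interval series."""
    rr = np.asarray(rr, dtype=float)
    diff = np.diff(rr)
    sd1 = np.sqrt(0.5 * np.var(diff))                      # short-term spread
    sd2 = np.sqrt(max(2.0 * np.var(rr) - sd1 ** 2, 0.0))   # long-term spread
    return sd1, sd2

rng = np.random.default_rng(1)
rr = 800 + np.cumsum(rng.normal(0, 15, 300))   # synthetic RR intervals in ms
sd1, sd2 = poincare_sd(rr)
```

By construction SD1² + SD2² = 2·var(RR), so the pair summarizes the plot's ellipse; as the abstract notes, CCM goes further by also weighing the temporal ordering of the points, which SD1/SD2 ignore.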

Keywords: heart rate variability, chaotic system, Poincaré plot, variance, standard deviation, complex correlation measure

Procedia PDF Downloads 401
4467 Modeling Exponential Growth Activity Using Technology: A Research with Bachelor of Business Administration Students

Authors: V. Vargas-Alejo, L. E. Montero-Moguel

Abstract:

Understanding the concept of function has been important in mathematics education for many years. In this study, the models built by a group of five business administration and accounting undergraduate students carrying out a population-growth activity are analyzed. The theoretical framework is the Models and Modeling Perspective. The results show how the students included tables, graphs, and algebraic representations in their models. Using technology was useful for interpreting, describing, and predicting the situation. The first model the students built to describe the situation was linear; after that, they modified and refined their ways of thinking and finally arrived at an exponential growth model. The modeling activity was useful for deepening understanding of mathematical concepts such as covariation, rate of change, and the exponential function, and for differentiating between linear and exponential growth.
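The linear-versus-exponential distinction the students worked through can be sketched numerically: a linear model adds a constant amount per period, while an exponential model multiplies by a constant factor, which becomes a linear fit after taking logarithms. The population values below are hypothetical, not the students' data.

```python
import numpy as np

# hypothetical population counts growing by a constant factor each period
t = np.arange(8)
pop = 120 * 1.35 ** t

# a linear model y = a*t + b captures a constant *difference*, not a constant factor
slope_lin, intercept_lin = np.polyfit(t, pop, 1)

# exponential model P = P0 * r**t, fit by log-linear regression:
# log P = log P0 + t * log r
slope, intercept = np.polyfit(t, np.log(pop), 1)
p0, rate = np.exp(intercept), np.exp(slope)
```

On exact exponential data the log-linear fit recovers the initial value and growth factor, while the linear fit leaves large, systematically curved residuals, which is what pushes students from the first model to the second.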

Keywords: covariation reasoning, exponential function, modeling, representations

Procedia PDF Downloads 120
4466 Design of New Alloys from Al-Ti-Zn-Mg-Cu System by in situ Al3Ti Formation

Authors: Joao Paulo De Oliveira Paschoal, Andre Victor Rodrigues Dantas, Fernando Almeida Da Silva Fernandes, Eugenio Jose Zoqui

Abstract:

With the adoption of High Pressure Die Casting technologies for the production of automotive bodies in the famous Giga Castings, the technology of processing metal alloys in the semi-solid state (SSM) becomes interesting because it allows higher product quality, such as lower porosity and fewer shrinkage voids. However, the alloys currently processed are derived from the foundry industry and are based on the Al-Si-(Cu-Mg) system. High-strength alloys, such as those of the Al-Zn-Mg-Cu system, are not usually processed, but the benefits of this heat-treatable system can be combined with the advantages of semi-solid processing, opening new production routes and improving product performance. The current work proposes a new range of alloys to be processed in the semi-solid state, obtained by modifying aluminum alloys of the Al-Zn-Mg-Cu system through the in-situ formation of the Al3Ti intermetallic. These alloys presented the thermodynamic stability required for semi-solid processing, with a sensitivity below 0.03 °C⁻¹ over a wide temperature range. Furthermore, they reached high hardness after aging heat treatment, up to 190 HV. They are therefore excellent candidates for the manufacture of parts that require low defect levels and high mechanical strength.

Keywords: aluminum alloys, semisolid metals processing, intermetallics, heat treatment, titanium aluminide

Procedia PDF Downloads 20
4465 A Fast Parallel and Distributed Type-2 Fuzzy Algorithm Based on Cooperative Mobile Agents Model for High Performance Image Processing

Authors: Fatéma Zahra Benchara, Mohamed Youssfi, Omar Bouattane, Hassan Ouajji, Mohamed Ouadi Bensalah

Abstract:

The aim of this paper is to present a distributed implementation of the Type-2 Fuzzy algorithm in a parallel and distributed computing environment based on mobile agents. The proposed algorithm is implemented on an SPMD (Single Program Multiple Data) architecture based on cooperative mobile agents following the AVPE (Agent Virtual Processing Element) model, in order to provide the processing resources needed for big-data image segmentation. In this work, we focus on applying the algorithm to large MRI (Magnetic Resonance Imaging) images of size (m x n). The image is encapsulated in the mobile-agent team leader and split into (m x n) pixels, one per AVPE. Each AVPE performs its part of the segmentation, exchanges results, and maintains asynchronous communication with the team leader until the algorithm converges. Interesting experimental results are obtained in the accuracy and efficiency analysis of the proposed implementation, thanks to the several useful skills of mobile agents introduced in this distributed computational model.
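The SPMD pattern described, workers each running the same update on their own chunk of pixels while a leader gathers and redistributes shared state, can be sketched with ordinary threads standing in for the mobile agents, and type-1 fuzzy c-means standing in for the type-2 algorithm (a deliberate simplification; the paper's method is more elaborate). The synthetic two-intensity "MRI" data is an illustrative assumption.

```python
import numpy as np
from concurrent.futures import ThreadPoolExecutor

def memberships(chunk, centers, m=2.0):
    # fuzzy membership of each pixel in `chunk` to every cluster center
    d = np.abs(chunk[:, None] - centers[None, :]) + 1e-12
    u = d ** (-2.0 / (m - 1.0))
    return u / u.sum(axis=1, keepdims=True)

def parallel_fcm(pixels, k=2, workers=4, iters=30, m=2.0):
    centers = np.quantile(pixels, np.linspace(0.1, 0.9, k))   # spread-out init
    chunks = np.array_split(pixels, workers)                  # one chunk per "AVPE"
    with ThreadPoolExecutor(workers) as pool:
        for _ in range(iters):
            # SPMD step: every worker runs the same membership update on its chunk
            parts = pool.map(lambda c: memberships(c, centers, m), chunks)
            u = np.concatenate(list(parts)) ** m
            # "team leader" step: gather results and update the shared centers
            centers = (u * pixels[:, None]).sum(axis=0) / u.sum(axis=0)
    return np.sort(centers)

# synthetic two-tissue "MRI" intensities clustered around 50 and 200
rng = np.random.default_rng(1)
pixels = np.concatenate([rng.normal(50, 5, 500), rng.normal(200, 5, 500)])
centers = parallel_fcm(pixels)
```

The per-chunk membership computation is embarrassingly parallel; only the small centroid update needs the gathered results, which mirrors the leader/AVPE division of labor in the abstract.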

Keywords: distributed type-2 fuzzy algorithm, image processing, mobile agents, parallel and distributed computing

Procedia PDF Downloads 429
4464 A Hybrid Digital Watermarking Scheme

Authors: Nazish Saleem Abbas, Muhammad Haris Jamil, Hamid Sharif

Abstract:

Digital watermarking is a technique that allows an individual to add and hide secret information, a copyright notice, or another verification message inside digital audio, video, or images. Today, with the advancement of technology, modern healthcare systems in many countries manage patients' diagnostic information digitally. When transmitted between hospitals over the internet, medical data becomes vulnerable to attacks and requires security and confidentiality. Digital watermarking techniques are used to ensure the authenticity, security, and management of medical images and related information. This paper proposes a watermarking technique that embeds a watermark in medical images imperceptibly and securely. In this work, digital watermarking of medical images is carried out using the Least Significant Bit (LSB) with the Discrete Cosine Transform (DCT). The proposed embedding and extraction of a watermark are performed in the frequency domain, using the LSB with an XOR operation. The quality of the watermarked medical image is measured by the peak signal-to-noise ratio (PSNR). It was observed that the watermarked medical image obtained by the XOR operation between DCT and LSB survived a compression attack, achieving a PSNR of up to 38.98 dB.
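The core of a DCT+LSB+XOR scheme can be sketched end to end: transform the image, XOR each watermark bit with a key bit, write the result into the LSB of an integer-rounded mid-band coefficient, invert the transform, and measure PSNR. This is a simplified illustration of the general technique, not the authors' exact pipeline: it uses one full-frame DCT rather than 8x8 blocks, and omits pixel quantization so that extraction is exact; the image, payload, and coefficient positions are all assumptions.

```python
import numpy as np

def dct_matrix(n):
    # orthonormal DCT-II basis matrix
    k = np.arange(n)[:, None]
    C = np.sqrt(2 / n) * np.cos(np.pi * (2 * np.arange(n) + 1) * k / (2 * n))
    C[0] /= np.sqrt(2)
    return C

rng = np.random.default_rng(7)
img = rng.integers(0, 256, (32, 32)).astype(float)    # stand-in "medical image"

C = dct_matrix(32)
D = C @ img @ C.T                                     # 2-D DCT of the image

bits = rng.integers(0, 2, 64)                         # watermark payload bits
key = np.random.default_rng(99).integers(0, 2, 64)    # shared secret key bits
rows, cols = np.divmod(np.arange(64) + 200, 32)       # fixed mid-band positions

coeff = np.round(D[rows, cols]).astype(int)
D[rows, cols] = (coeff & ~1) | (bits ^ key)           # LSB <- bit XOR key
wm = C.T @ D @ C                                      # inverse DCT (pixel rounding omitted)

mse = np.mean((img - wm) ** 2)
psnr = 10 * np.log10(255 ** 2 / mse)                  # imperceptibility measure

# extraction: forward DCT of the watermarked image, read LSBs, undo the XOR
D2 = C @ wm @ C.T
recovered = (np.round(D2[rows, cols]).astype(int) & 1) ^ key
```

Because only a handful of coefficients change by at most one quantization step, the PSNR stays high while the XOR keying keeps the payload unreadable without the key.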

Keywords: watermarking, image processing, DCT, LSB, PSNR

Procedia PDF Downloads 51
4463 An Adaptive Decomposition for the Variability Analysis of Observation Time Series in Geophysics

Authors: Olivier Delage, Thierry Portafaix, Hassan Bencherif, Guillaume Guimbretiere

Abstract:

Most observation data sequences in geophysics can be interpreted as resulting from the interaction of several physical processes at several time and space scales. As a consequence, measurement time series in geophysics often exhibit non-linearity and non-stationarity, show strong fluctuations at all time scales, and require a time-frequency representation to analyze their variability. Empirical Mode Decomposition (EMD) is a relatively new technique, part of a more general signal-processing method called the Hilbert-Huang transform. This analysis method turns out to be particularly suitable for non-linear and non-stationary signals: it decomposes a signal in an auto-adaptive way into a sum of oscillating components named IMFs (Intrinsic Mode Functions), thereby acting as a bank of bandpass filters. The advantages of the EMD technique are that it is entirely data-driven and that it provides the principal variability modes of the dynamics represented by the original time series. However, its main limiting factor is the frequency resolution, which may give rise to the mode-mixing phenomenon, where the spectral contents of some IMFs overlap. To overcome this problem, J. Gilles proposed an alternative entitled "Empirical Wavelet Transform" (EWT), which builds a bank of filters from the segmentation of the original signal's Fourier spectrum. The method is based on the idea used in the construction of both Littlewood-Paley and Meyer's wavelets. The heart of the method lies in segmenting the Fourier spectrum based on local maxima detection, in order to obtain a set of non-overlapping segments. Because it is linked to the Fourier spectrum, the frequency resolution provided by EWT is higher than that provided by EMD and therefore makes it possible to overcome the mode-mixing problem.
On the other hand, while the EWT technique can detect the frequencies involved in the fluctuations of the original time series, it does not associate the detected frequencies with a specific mode of variability, as EMD does. Because EMD is closer to the observation of physical phenomena than EWT, we propose here a new technique called EAWD (Empirical Adaptive Wavelet Decomposition), based on coupling the EMD and EWT techniques: the spectral content of the IMFs is used to optimize the segmentation of the Fourier spectrum required by EWT. In this study, the EMD and EWT techniques are described, and then the EAWD technique is presented. A comparison of the results obtained respectively by EMD, EWT, and EAWD on time series of total ozone columns recorded at Reunion Island over the 1978-2019 period is discussed. This study was carried out as part of the SOLSTYCE project, dedicated to the characterization and modeling of the underlying dynamics of time series issued from complex systems in atmospheric sciences.
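The EMD sifting loop the abstract describes, subtract the mean of the upper and lower extrema envelopes until an oscillating IMF remains, then repeat on the residue, can be sketched crudely as follows. This is a deliberately simplified illustration (linear instead of cubic-spline envelopes, a fixed sifting count, no stopping criterion), not the full Hilbert-Huang implementation.

```python
import numpy as np

def sift(x, n_sift=8):
    """Extract one IMF via crude sifting with linear-interpolated envelopes."""
    h = x.copy()
    t = np.arange(len(x))
    for _ in range(n_sift):
        # interior local maxima and minima of the current candidate
        up = np.flatnonzero((h[1:-1] > h[:-2]) & (h[1:-1] > h[2:])) + 1
        lo = np.flatnonzero((h[1:-1] < h[:-2]) & (h[1:-1] < h[2:])) + 1
        if len(up) < 2 or len(lo) < 2:
            break                                   # too few extrema to continue
        mean_env = (np.interp(t, up, h[up]) + np.interp(t, lo, h[lo])) / 2
        h = h - mean_env                            # remove the local mean
    return h

def emd(x, n_imfs=3):
    """Peel off n_imfs IMFs; whatever is left is the residue (trend)."""
    imfs, residue = [], x.copy()
    for _ in range(n_imfs):
        imf = sift(residue)
        imfs.append(imf)
        residue = residue - imf
    return np.array(imfs), residue

t = np.linspace(0, 1, 1000)
x = np.sin(2 * np.pi * 30 * t) + 0.5 * np.sin(2 * np.pi * 3 * t)
imfs, residue = emd(x)
```

By construction the decomposition is complete, the IMFs plus the residue reconstruct the signal exactly, which is the "bank of bandpass filters" property EAWD then refines with EWT's sharper Fourier-domain segmentation.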

Keywords: adaptive filtering, empirical mode decomposition, empirical wavelet transform, filter banks, mode-mixing, non-linear and non-stationary time series, wavelet

Procedia PDF Downloads 139
4462 Thermodynamic Analysis and Experimental Study of Agricultural Waste Plasma Processing

Authors: V. E. Messerle, A. B. Ustimenko, O. A. Lavrichshev

Abstract:

A large amount of manure and its irrational use negatively affect the environment. Compared with biomass fermentation, plasma processing of manure makes it possible to intensify the process of obtaining fuel gas, which consists mainly of synthesis gas (CO + H₂), and to increase plant productivity by 150–200 times. This is achieved due to the high temperature in the plasma reactor and a multiple reduction in waste processing time. This paper examines the plasma processing of biomass using the example of dried mixed animal manure (dung with a moisture content of 30%). Characteristic composition of dung, wt.%: Н₂О – 30, С – 29.07, Н – 4.06, О – 32.08, S – 0.26, N – 1.22, P₂O₅ – 0.61, K₂O – 1.47, СаО – 0.86, MgO – 0.37. The thermodynamic code TERRA was used to numerically analyze dung plasma gasification and pyrolysis over the temperature range 300–3,000 K at a pressure of 0.1 MPa, for the following thermodynamic systems: 100% dung + 25% air (plasma gasification) and 100% dung + 25% nitrogen (plasma pyrolysis). Calculations were conducted to determine the composition of the gas phase, the degree of carbon gasification, and the specific energy consumption of the processes. At the optimum temperature of 1,500 K, which provides both complete gasification of dung carbon and the maximum yield of combustible components (99.4 vol.% during gasification and 99.5 vol.% during pyrolysis), as well as decomposition of the toxic compounds furan, dioxin, and benz(a)pyrene, the following combustible gas compositions were obtained, vol.%: СО – 29.6, Н₂ – 35.6, СО₂ – 5.7, N₂ – 10.6, H₂O – 17.9 (gasification) and СО – 30.2, Н₂ – 38.3, СО₂ – 4.1, N₂ – 13.3, H₂O – 13.6 (pyrolysis). The specific energy consumption of gasification and pyrolysis of dung at 1,500 K is 1.28 and 1.33 kWh/kg, respectively.
An installation with a DC plasma torch with a rated power of 100 kW and a plasma reactor with a dung capacity of 50 kg/h was used for dung processing experiments. The dung was gasified in an air (or nitrogen during pyrolysis) plasma jet, which provided a mass-average temperature in the reactor volume of at least 1,600 K. The organic part of the dung was gasified, and the inorganic part of the waste was melted. For pyrolysis and gasification of dung, the specific energy consumption was 1.5 kWh/kg and 1.4 kWh/kg, respectively. The maximum temperature in the reactor reached 1,887 K. At the outlet of the reactor, a gas of the following composition was obtained, vol.%: СO – 25.9, H₂ – 32.9, СO₂ – 3.5, N₂ – 37.3 (pyrolysis in nitrogen plasma); СO – 32.6, H₂ – 24.1, СO₂ – 5.7, N₂ – 35.8 (air plasma gasification). The specific heat of combustion of the combustible gas formed during pyrolysis and plasma-air gasification of agricultural waste is 10,500 and 10,340 kJ/kg, respectively. Comparison of the integral indicators of dung plasma processing showed satisfactory agreement between the calculation and experiment.

Keywords: agricultural waste, experiment, plasma gasification, thermodynamic calculation

Procedia PDF Downloads 42
4461 Open-Source YOLO CV For Detection of Dust on Solar PV Surface

Authors: Jeewan Rai, Kinzang, Yeshi Jigme Choden

Abstract:

Accumulation of dust on solar panels impacts their overall efficiency and the amount of energy they produce. While various techniques exist for detecting dust in order to schedule cleaning, many of them use MATLAB image-processing tools and other licensed software, which can be financially burdensome. This study investigates the efficiency of a free, open-source computer-vision library using the YOLO algorithm. The proposed approach was tested on images of solar panels with varying dust levels in an experimental setup. The findings illustrate the effectiveness of the YOLO-based image-classification method and the overall dust-detection approach, with an accuracy of 90% in distinguishing between clean and dusty panels. This open-source solution provides a cost-effective and accessible alternative to commercial image-processing tools, offering a way to optimize solar panel maintenance and enhance energy production.

Keywords: YOLO, openCV, dust detection, solar panels, computer vision, image processing

Procedia PDF Downloads 36
4460 Operational Matrix Method for Fuzzy Fractional Reaction Diffusion Equation

Authors: Sachin Kumar

Abstract:

The fuzzy fractional diffusion equation is widely used to describe physical processes arising in physics, biology, and hydrology, and is the subject of this article. We study a mathematical model of the fuzzy space-time fractional diffusion equation in which the unknown function, the coefficients, and the initial and boundary conditions are fuzzy numbers. First, we derive a fuzzy operational matrix of Legendre polynomials for the Caputo-type fuzzy fractional derivative with a non-singular Mittag-Leffler kernel. The main advantage of this method is that it reduces the fuzzy fractional partial differential equation (FFPDE) to a system of fuzzy algebraic equations, from which the solution of the problem can be found. The feasibility of our approach is shown by numerical examples. Hence, our method is suitable for FFPDEs and has good accuracy.
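The Caputo-type derivative with a non-singular Mittag-Leffler kernel mentioned above is commonly written in the Atangana-Baleanu-Caputo form; as background (recalling the standard real-valued definition, with a normalization function B(α) satisfying B(0) = B(1) = 1, rather than the paper's fuzzy-valued statement):

```latex
% Atangana--Baleanu--Caputo fractional derivative, 0 < \alpha < 1:
{}^{ABC}_{\;\;\;a}D^{\alpha}_{t}\, f(t)
   \;=\; \frac{B(\alpha)}{1-\alpha}
   \int_{a}^{t} f'(s)\,
   E_{\alpha}\!\left[-\frac{\alpha\,(t-s)^{\alpha}}{1-\alpha}\right] ds,
\qquad
E_{\alpha}(z) \;=\; \sum_{k=0}^{\infty} \frac{z^{k}}{\Gamma(\alpha k + 1)}.
```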

Keywords: fractional PDE, fuzzy valued function, diffusion equation, Legendre polynomial, spectral method

Procedia PDF Downloads 202
4459 Enhancement of Performance Utilizing Low Complexity Switched Beam Antenna

Authors: P. Chaipanya, R. Keawchai, W. Sombatsanongkhun, S. Jantaramporn

Abstract:

To meet the dramatically increasing demand on wireless communication, this work focuses on switched beam antennas in smart antenna systems. Implementing switched beam antennas at mobile terminals such as notebooks or handsets is a preferable way to increase the performance of wireless communication systems. This paper proposes a low-complexity switched beam antenna using a single antenna element, suitable for implementation at a mobile terminal. The main beam direction is switched by changing the positions of a short circuit on the radiating patch; four switching cases provide four different main-beam directions. Moreover, the performance in terms of signal-to-interference ratio obtained with the proposed antenna is compared with that of an omni-directional antenna to confirm the performance improvement.

Keywords: switched beam, short circuit, single element, signal to interference ratio

Procedia PDF Downloads 172
4458 The Convolution Recurrent Network of Using Residual LSTM to Process the Output of the Downsampling for Monaural Speech Enhancement

Authors: Shibo Wei, Ting Jiang

Abstract:

Convolutional-recurrent neural networks (CRN) have recently achieved much success in the speech enhancement field. The common processing method is to use convolution layers to compress the feature space through successive downsampling and then model the compressed features with an LSTM layer. Finally, the enhanced speech is obtained by deconvolution, which reintegrates the global information of the speech sequence. However, the feature-space compression may cause a loss of information, so we propose to model the output of each downsampling step with a residual LSTM layer, concatenate it with the output of the corresponding deconvolution layer, and feed the result to the next deconvolution layer; in this way, we aim to better integrate the global information of the speech sequence. The experimental results show that the proposed network model (RES-CRN) achieves better performance than the original CRN, both without residual connections and with simply stacked LSTM layers, in terms of scale-invariant signal-to-distortion ratio (SI-SNR), speech quality (PESQ), and intelligibility (STOI).
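The SI-SNR metric used for evaluation has a short, standard definition: project the (zero-mean) estimate onto the reference to get the target component, and compare its energy to the residual's. The sketch below computes it on synthetic signals; the signals themselves are illustrative assumptions, not the paper's test data.

```python
import numpy as np

def si_snr(est, ref):
    """Scale-invariant signal-to-noise ratio in dB (higher is better)."""
    est = est - est.mean()
    ref = ref - ref.mean()
    s_target = (est @ ref) / (ref @ ref) * ref    # projection onto the reference
    e_noise = est - s_target                      # everything not explained by it
    return 10 * np.log10((s_target @ s_target) / (e_noise @ e_noise))

rng = np.random.default_rng(0)
clean = rng.standard_normal(16000)                # 1 s of "speech" at 16 kHz
noisy = clean + 0.1 * rng.standard_normal(16000)  # ~20 dB of additive noise
score = si_snr(noisy, clean)
```

Because the metric is built from a projection, rescaling the estimate leaves the score unchanged, which is exactly the "scale-invariant" property that makes it preferable to plain SNR for enhancement networks with unconstrained output gain.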

Keywords: convolutional-recurrent neural networks, speech enhancement, residual LSTM, SI-SNR

Procedia PDF Downloads 202
4457 Detection of Phoneme [S] Mispronunciation for Sigmatism Diagnosis in Adults

Authors: Michal Krecichwost, Zauzanna Miodonska, Pawel Badura

Abstract:

The diagnosis of sigmatism is mostly based on observation of the articulatory organs. It is, however, not always possible to precisely observe the vocal apparatus, in particular in the oral cavity of the patient. Speech processing can help objectify the therapy and simplify the verification of its progress. In the described study, a methodology for classifying the incorrectly pronounced phoneme [s] is proposed. The recordings come from adults and were registered with a speech recorder at a sampling rate of 44.1 kHz and a resolution of 16 bits. A database of pathological and normative speech was collected for the study, including reference assessments provided by speech therapy experts. Ten adult subjects were asked to simulate a certain type of sigmatism under the supervision of a speech therapy expert. In the recordings, the analyzed phone [s] was surrounded by vowels, viz: ASA, ESE, ISI, SPA, USU, YSY. Thirteen MFCC (mel-frequency cepstral coefficient) and RMS (root mean square) values are calculated within each frame of the analyzed phoneme. Additionally, three fricative formants, along with their corresponding amplitudes, are determined for the entire segment. To aggregate the information within the segment, the average value of each MFCC coefficient is calculated; all features of other types are aggregated by their 75th percentile. The proposed feature aggregation reduces the size of the feature vector used in classification. A binary SVM (support vector machine) classifier is employed at the phoneme recognition stage: one class consists of pathological phones, the other of normative ones. The proposed feature vector yields classification sensitivity and specificity above 90% for individual phones. Employing fricative-formant-based information improves the MFCC-only classification results by an average of 5 percentage points.
The study shows that employing parameters specific to the selected phones improves the efficiency of pathology detection compared to traditional methods of speech-signal parameterization.
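The feature-aggregation step, mean per MFCC coefficient, 75th percentile for everything else, plus segment-level formant amplitudes, can be sketched as below. The per-frame matrices here are random placeholders standing in for real acoustic features, purely to show how the fixed-length vector for the SVM is assembled.

```python
import numpy as np

rng = np.random.default_rng(0)
# hypothetical per-frame features for one [s] segment: 40 frames x 13 MFCCs, plus RMS
mfcc = rng.standard_normal((40, 13))
rms = np.abs(rng.standard_normal(40))
formant_amps = np.array([0.8, 0.5, 0.3])   # hypothetical 3 fricative-formant amplitudes

features = np.concatenate([
    mfcc.mean(axis=0),                     # MFCCs aggregated by their mean
    [np.percentile(rms, 75)],              # other frame features by 75th percentile
    formant_amps,                          # segment-level formant information
])
```

Whatever the segment length, the result is one 17-dimensional vector per phoneme, which is what makes a standard binary SVM applicable at the recognition stage.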

Keywords: computer-aided pronunciation evaluation, sibilants, sigmatism diagnosis, speech processing

Procedia PDF Downloads 284
4456 Valorization Cascade Approach of Fish By-Products towards a Zero-Waste Future: A Review

Authors: Joana Carvalho, Margarida Soares, André Ribeiro, Lucas Nascimento, Nádia Valério, Zlatina Genisheva

Abstract:

Following the exponential growth of the human population, a remarkable increase in the amount of fish waste has been produced worldwide. The fish-processing industry generates a considerable amount of by-products, which represent both an environmental problem and an economic inefficiency; accordingly, the reuse and valorisation of these by-products is a key process for marine resource preservation, and the transformative potential of turning fish-processing waste into industrial value is gaining recognition. Different added-value products can be recovered by the valorisation industries, while fishing companies can save the costs associated with managing those wastes, with advantages not only in economic terms but also in terms of environmental impact. Fish-processing by-products have numerous applications; the target product portfolio includes fish oil, fish protein hydrolysates, bacteriocins, pigments, vitamins, collagen, and calcium-rich powder, aimed at food products, additives, supplements, and nutraceuticals. This literature review focuses on the main valorisation routes for fish wastes, the different compounds with high commercial value obtained from fish by-products, and their possible applications in different fields. Highlighting this potential in sustainable resource-management strategies can play an important role in reshaping the fish-processing industry, driving it towards a circular economy and, consequently, a more sustainable future.

Keywords: fish process industry, fish wastes, by-products, circular economy, sustainability

Procedia PDF Downloads 20
4455 Text Based Shuffling Algorithm on Graphics Processing Unit for Digital Watermarking

Authors: Zayar Phyo, Ei Chaw Htoon

Abstract:

In a New-LSB-based steganography method, the Fisher-Yates algorithm is used to randomly permute an existing array. However, that algorithm becomes slow and runs into memory overflow problems when processing large images. The Text-Based Shuffling algorithm therefore selects only the necessary pixels as hiding positions at specific locations of an image, according to the length of the input text. In this paper, an enhanced text-based shuffling algorithm is presented, powered by the GPU for better performance. The proposed algorithm employs the OpenCL Aparapi framework along with an XORShift kernel acting as the pseudo-random number generator (PRNG); the PRNG produces random numbers inside the OpenCL kernel. Experiments with the proposed algorithm on a GPU show that it achieves faster processing speed and better efficiency without disruption from unnecessary operating-system tasks.
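The key idea, generate only as many pseudo-random pixel positions as the text needs, instead of shuffling the whole pixel array Fisher-Yates-style, can be sketched on the CPU with a minimal 32-bit XORShift generator (the same family of PRNG the GPU kernel uses). The seed and image size below are illustrative assumptions.

```python
def xorshift32(seed):
    """Minimal 32-bit XORShift PRNG (Marsaglia's classic shift triple 13/17/5)."""
    x = seed & 0xFFFFFFFF
    while True:
        x ^= (x << 13) & 0xFFFFFFFF
        x ^= x >> 17
        x ^= (x << 5) & 0xFFFFFFFF
        yield x

def hiding_positions(text, width, height, seed=2463534242):
    """Pick one distinct pixel index per hidden bit -- only as many as the text needs."""
    need = len(text.encode("utf-8")) * 8       # one pixel per payload bit
    gen = xorshift32(seed)
    chosen, order = set(), []
    while len(order) < need:
        p = next(gen) % (width * height)       # candidate pixel index
        if p not in chosen:                    # keep positions distinct
            chosen.add(p)
            order.append(p)
    return order

pos = hiding_positions("hello", 512, 512)
```

For a 5-character message this touches only 40 of the 262,144 pixels, and the same seed reproduces the same positions at extraction time, which is why a full-array shuffle (and its memory cost) is unnecessary.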

Keywords: LSB based steganography, Fisher-Yates algorithm, text-based shuffling algorithm, OpenCL, XORShiftKernel

Procedia PDF Downloads 152
4454 Effect of Different Processing Methods on the Quality Attributes of Pigeon Pea Used in Bread Production

Authors: B. F. Olanipekun, O. J. Oyelade, C. O. Osemobor

Abstract:

Pigeon pea is a very good source of protein and micronutrients, but it is underutilized in Nigeria because of several constraints. This research considered the effect of different processing methods on the quality attributes of pigeon pea used in bread production, towards enhancing its utility. Pigeon pea was obtained at a local market and processed into flour using three methods: soaking, sprouting and roasting, and the flours were used to bake bread in different proportions. The chemical composition and sensory attributes of the breads were then determined. The highest values of protein and ash contents were obtained from 20% substitution of sprouted pigeon pea in wheat flour, which may be attributable to complex biochemical changes occurring during hydration that lead to the breakdown of protein constituents. Hydrolytic activity of the enzymes in the sprouted sample improved the total protein content, probably due to a reduction in the carbohydrate content. Sensory quality analyses showed that breads produced with soaked and roasted pigeon pea flours at 5 and 10% inclusion, respectively, were more accepted than the other blends, while products with sprouted pigeon pea flour were least accepted. The findings of this research suggest that supplementing wheat flour with sprouted pigeon pea has greater nutritional potential; however, by the sensory analysis indices, the soaked and roasted pigeon pea blends up to 10% are most accepted and can also improve nutritional status. Overall, this will be very beneficial to populations dependent on plant protein in order to combat malnutrition.

Keywords: pigeon pea, processing, protein, malnutrition

Procedia PDF Downloads 252
4453 Intelligent Chatbot Generating Dynamic Responses Through Natural Language Processing

Authors: Aarnav Singh, Jatin Moolchandani

Abstract:

The proposed research work aims to build a query-based AI chatbot that can answer questions on any topic. A chatbot is software that converses with users via text messages. In the proposed system, the chatbot generates a response based on the user’s query: natural language processing is used to comprehend the query, and a set of texts is examined to form a concise answer. The texts are obtained through web scraping, filtering for credible sources returned by a web search. The objective of this project is to provide a chatbot that gives simple and accurate answers without the user having to read through a large number of articles and websites. In addition to exploring the reasons for chatbots' broad acceptance and their usefulness across many industries, this article offers an overview of the worldwide interest in chatbots.
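The query-to-answer pipeline described above can be sketched minimally with an extractive approach: score each sentence from the scraped documents by word overlap with the query and return the best-matching ones. This is only an illustration of the shape of such a system; the abstract does not specify the authors' NLP stack, and the function and scoring rule here are assumptions:

```python
import re

def answer_query(query, documents, top_k=2):
    """Return the `top_k` scraped sentences that best overlap the query."""
    q_words = set(re.findall(r"\w+", query.lower()))
    sentences = []
    for doc in documents:
        # naive sentence split on terminal punctuation
        sentences += re.split(r"(?<=[.!?])\s+", doc)
    scored = sorted(
        sentences,
        key=lambda s: len(q_words & set(re.findall(r"\w+", s.lower()))),
        reverse=True,
    )
    return " ".join(scored[:top_k])
```

A production system would replace the overlap score with proper NLP (embeddings, ranking, summarization), but the flow (scrape, split, score against the query, return a concise answer) matches the abstract's description.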

Keywords: chatbot, artificial intelligence, natural language processing, web scraping

Procedia PDF Downloads 66
4452 Using Textual Pre-Processing and Text Mining to Create Semantic Links

Authors: Ricardo Avila, Gabriel Lopes, Vania Vidal, Jose Macedo

Abstract:

This article offers an approach to the automatic discovery of semantic concepts and links in the domain of Oil Exploration and Production (E&P). Machine learning methods combined with textual pre-processing techniques were used to detect local patterns in texts and, thus, to generate new concepts and new semantic links. Even with the highly specific vocabularies of the oil domain, our approach achieved satisfactory results, suggesting that the proposal can be applied to other domains and languages, requiring only minor adjustments.

Keywords: semantic links, data mining, linked data, SKOS

Procedia PDF Downloads 181
4451 Material Detection by Phase Shift Cavity Ring-Down Spectroscopy

Authors: Rana Muhammad Armaghan Ayaz, Yigit Uysallı, Nima Bavili, Berna Morova, Alper Kiraz

Abstract:

Traditional optical methods used for material detection and sensing, such as resonance wavelength shift and cavity ring-down spectroscopy, have disadvantages: they are less resistant to laser noise and temperature fluctuations, and extracting the required information, such as the ring-down time in cavity ring-down spectroscopy, can be difficult. Phase shift cavity ring-down spectroscopy is not only easy to use but is also capable of overcoming these problems. This technique compares the phase of the signal coming out of the cavity with a reference signal; a material is detected by the phase difference between them. Using this technique, air, water, and isopropyl alcohol can be recognized easily. This methodology has far-reaching applications and can be used in air pollution detection, human breath analysis, and more.
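The quantity linking the measured phase difference to the material is the cavity ring-down time. A short sketch of the standard phase-shift relation, tan(φ) = −ωτ for modulation angular frequency ω, which is the textbook basis of PS-CRDS rather than a formula stated in this abstract (parameter values below are illustrative):

```python
import math

def ring_down_time(phase_shift_rad, mod_freq_hz):
    """Recover the cavity ring-down time tau from the measured phase shift.

    Standard PS-CRDS relation: tan(phi) = -omega * tau, omega = 2*pi*f,
    where phi is the (negative) phase lag of the cavity output relative
    to the reference modulation signal.
    """
    omega = 2.0 * math.pi * mod_freq_hz
    return -math.tan(phase_shift_rad) / omega
```

Because τ changes with intracavity absorption, a material (air, water, isopropyl alcohol) can be identified from the phase shift alone, without fitting an exponential decay as in conventional ring-down measurements.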

Keywords: materials, noise, phase shift, resonance wavelength, sensitivity, time domain approach

Procedia PDF Downloads 150
4450 Creative Potential of Children with Learning Disabilities

Authors: John McNamara

Abstract:

Growing up creative is an important idea in today’s classrooms. As education seeks to prepare children for their futures, it is important that the system considers traditional as well as non-traditional pathways. This poster describes the findings of a research study investigating creative potential in children with learning disabilities. Children with learning disabilities were administered the Torrance Test of Creative Problem Solving along with subtests from the Comprehensive Test of Phonological Processing. A quantitative comparative analysis was computed using paired-sample t-tests. Results indicated a statistically significant difference between children’s creative problem-solving skills and their reading-based skills. The results lend support to the idea that children with learning disabilities have inherent strengths in the area of creativity. It can be hypothesized that the success of these children is associated with the notion that they are using a type of neurological processing that is not otherwise used in academic tasks. Children with learning disabilities, who are presumed to have a left-hemisphere processing deficit, process information with the right side of the brain, even for tasks that would typically be processed by the left side (i.e., language). In over-using their right hemisphere, it is hypothesized that children with learning disabilities have well-developed right hemispheres and, as such, have strengths associated with this type of processing, such as innovation and creativity. The current study lends support to the notion that children with learning disabilities may be particularly primed to succeed in areas that call on creativity and creative thinking.

Keywords: learning disabilities, educational psychology, education, creativity

Procedia PDF Downloads 70
4449 Developing Manufacturing Process for the Graphene Sensors

Authors: Abdullah Faqihi, John Hedley

Abstract:

Biosensors play a significant role in the healthcare sector and in scientific and technological progress. Developing electrodes that are easy to manufacture and deliver better electrochemical performance is advantageous for diagnostics and biosensing. They can be implemented extensively in various analytical tasks such as drug discovery, food safety, medical diagnostics, process control, security and defence, in addition to environmental monitoring. A biosensor is a device that inspects the biological and chemical reactions generated by a biological sample, carrying out detection via a linked transducer that converts the biological response into an electrical signal; stability, selectivity, and sensitivity are the dynamic and static characteristics that dictate the quality and performance of biosensors. In this research, an experimental study of the laser scribing technique for processing graphene oxide (GO) inside a vacuum chamber is presented. The effect of laser scribing on the reduction of GO was investigated under two conditions: atmosphere and vacuum. GO solvent was coated onto a LightScribe DVD, and the laser scribing technique was applied to reduce the GO layers to rGO. The morphological structures of rGO and GO were examined using scanning electron microscopy (SEM) and Raman spectroscopy. The first electrode was a traditional graphene-based electrode model made under normal atmospheric conditions, whereas the second was a graphene electrode fabricated in a vacuum state using a vacuum chamber; the purpose was to control the vacuum conditions, such as air pressure and temperature, during the fabrication process.
The parameters assessed include the layer thickness and the processing environment. The results presented show high accuracy and repeatability, achieving low-cost production.

Keywords: laser scribing, lightscribe DVD, graphene oxide, scanning electron microscopy

Procedia PDF Downloads 122
4448 Secured Embedding of Patient’s Confidential Data in Electrocardiogram Using Chaotic Maps

Authors: Butta Singh

Abstract:

This paper presents a chaotic map based approach for the secured embedding of a patient’s confidential data in the electrocardiogram (ECG) signal. The chaotic map generates predefined locations through the use of selective control parameters. The sample value difference method effectually hides the confidential data in ECG sample pairs at these predefined locations. Evaluation of the proposed method on all 48 records of the MIT-BIH arrhythmia ECG database demonstrates that the embedding does not alter the diagnostic features of the cover ECG. The imperceptibility of the secret data in the stego-ECG is evident through various statistical and clinical performance measures. Statistical metrics comprise the Percentage Root Mean Square Difference (PRD) and Peak Signal to Noise Ratio (PSNR). Further, a comparative analysis between the proposed method and existing approaches was performed; the results clearly demonstrated the superiority of the proposed method.
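The location-generation step can be illustrated with a chaotic map. The abstract does not name the specific map or its control parameters, so the logistic map and the values of `r` and `x0` below are assumptions chosen only to show the mechanism; the same seed and parameters must be shared by sender and receiver so that the embedding locations can be regenerated at extraction:

```python
def logistic_map_indices(n_samples, count, r=3.99, x0=0.7):
    """Generate `count` distinct embedding locations in [0, n_samples)
    by iterating the chaotic logistic map x -> r*x*(1-x)."""
    indices, seen = [], set()
    x = x0
    while len(indices) < count:
        x = r * x * (1.0 - x)        # chaotic iteration in (0, 1)
        idx = int(x * n_samples)     # scale to a sample index
        if idx not in seen:          # keep locations distinct
            seen.add(idx)
            indices.append(idx)
    return indices
```

At each generated location, a sample-value-difference rule would then adjust an ECG sample pair to encode one secret bit, leaving samples elsewhere untouched.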

Keywords: chaotic maps, ECG steganography, data embedding, electrocardiogram

Procedia PDF Downloads 198
4447 Mindfulness, Reinvestment, and Rowing under Pressure: Evidence for Moderated Moderation of the Anxiety-Performance Relationship

Authors: Katherine Sparks, Christopher Ring

Abstract:

This study aimed to investigate whether dispositional sport-specific mindfulness moderated the moderation effect of conscious processing on the relationship between anxiety and rowing race performance. Using a sport-specific (Rowing-Specific) Reinvestment Scale (RSRS) to measure state conscious processing, we examined the effects of trait sport-related mindfulness on the conscious processes of rowers under competitive racing pressure at a number of UK regattas. 276 rowers completed a post-race survey that included the RSRS, a mindfulness measure, a perceived performance rating scale, and demographic and race information used to identify and record the rower’s actual race performance. Results demonstrated that high levels of dispositional mindfulness are associated with superior performance under pressure. Regarding the moderated moderation effect, conscious processing amplifies the detrimental effects of anxiety on performance; however, mindfulness, mindful awareness, and mindful non-judgement all attenuated this amplification by moderating the conscious processing moderation of the anxiety-performance relationship. Therefore, this study provides initial support for the speculation that dispositional mindfulness can help prevent the deleterious effects of rowing-specific reinvestment under pressure.

Keywords: mindful, reinvestment, under pressure, performance, rowing

Procedia PDF Downloads 157
4446 Rapid Processing Techniques Applied to Sintered Nickel Battery Technologies for Utility Scale Applications

Authors: J. D. Marinaccio, I. Mabbett, C. Glover, D. Worsley

Abstract:

Through use of novel modern/rapid processing techniques such as screen printing and Near-Infrared (NIR) radiative curing, process time for the sintering of sintered nickel plaques, applicable to alkaline nickel battery chemistries, has been drastically reduced from in excess of 200 minutes with conventional convection methods to below 2 minutes using NIR curing methods. Steps have also been taken to remove the need for forming gas as a reducing agent by implementing carbon as an in-situ reducing agent, within the ink formulation.

Keywords: batteries, energy, iron, nickel, storage

Procedia PDF Downloads 441
4445 Numerical Implementation and Testing of Fractioning Estimator Method for the Box-Counting Dimension of Fractal Objects

Authors: Abraham Terán Salcedo, Didier Samayoa Ochoa

Abstract:

This work presents a numerical implementation of a method, named the fractioning estimator, for estimating the box-counting dimension of self-avoiding curves on a planar space: fractal objects captured in digital images. Classical digital image processing methods, such as noise filtering, contrast manipulation, and thresholding, among others, are used to obtain binary images suitable for performing the computations required by the fractioning estimator. A user interface is developed for performing the image processing operations and for testing the fractioning estimator on captured images of real-life fractal objects. To analyze the results, the estimates obtained through the fractioning estimator are compared to results obtained through other methods already implemented in available software for computing and estimating the box-counting dimension.
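For reference, the plain box-counting estimate that such methods refine can be sketched directly: cover the binary image with boxes of decreasing side length, count occupied boxes, and fit the slope of log N(s) against log(1/s). This is the classical baseline, not the paper's fractioning estimator, and the box sizes are illustrative:

```python
import numpy as np

def box_counting_dimension(binary_img, sizes=(2, 4, 8, 16, 32)):
    """Estimate the box-counting dimension of a 2D binary image."""
    counts = []
    h, w = binary_img.shape
    for s in sizes:
        n = 0
        for i in range(0, h, s):
            for j in range(0, w, s):
                # a box counts if it contains any foreground pixel
                if binary_img[i:i + s, j:j + s].any():
                    n += 1
        counts.append(n)
    # dimension = slope of log N(s) versus log(1/s)
    slope, _ = np.polyfit(np.log(1.0 / np.array(sizes)), np.log(counts), 1)
    return slope
```

A filled square yields a slope near 2 and a straight line near 1, which is a quick sanity check before applying any estimator to thresholded photographs of real fractal objects.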

Keywords: box-counting, digital image processing, fractal dimension, numerical method

Procedia PDF Downloads 83
4444 Quantitative Comparisons of Different Approaches for Rotor Identification

Authors: Elizabeth M. Annoni, Elena G. Tolkacheva

Abstract:

Atrial fibrillation (AF) is the most common sustained cardiac arrhythmia that is a known prognostic marker for stroke, heart failure and death. Reentrant mechanisms of rotor formation, which are stable electrical sources of cardiac excitation, are believed to cause AF. No existing commercial mapping systems have been demonstrated to consistently and accurately predict rotor locations outside of the pulmonary veins in patients with persistent AF. There is a clear need for robust spatio-temporal techniques that can consistently identify rotors using unique characteristics of the electrical recordings at the pivot point that can be applied to clinical intracardiac mapping. Recently, we have developed four new signal analysis approaches – Shannon entropy (SE), Kurtosis (Kt), multi-scale frequency (MSF), and multi-scale entropy (MSE) – to identify the pivot points of rotors. These proposed techniques utilize different cardiac signal characteristics (other than local activation) to uncover the intrinsic complexity of the electrical activity in the rotors, which are not taken into account in current mapping methods. We validated these techniques using high-resolution optical mapping experiments in which direct visualization and identification of rotors in ex-vivo Langendorff-perfused hearts were possible. Episodes of ventricular tachycardia (VT) were induced using burst pacing, and two examples of rotors were used showing 3-sec episodes of a single stationary rotor and figure-8 reentry with one rotor being stationary and one meandering. Movies were captured at a rate of 600 frames per second for 3 sec. with 64x64 pixel resolution. These optical mapping movies were used to evaluate the performance and robustness of SE, Kt, MSF and MSE techniques with respect to the following clinical limitations: different time of recordings, different spatial resolution, and the presence of meandering rotors. 
To quantitatively compare the results, the SE, Kt, MSF and MSE techniques were compared to the “true” rotor(s) identified using the phase map. Accuracy was calculated for each approach as the duration of the time series and the spatial resolution were reduced. The time series duration was decreased from its original length of 3 sec down to 2, 1, and 0.5 sec. The spatial resolution of the original VT episodes was decreased from 64x64 pixels to 32x32, 16x16, and 8x8 pixels by uniformly removing pixels from the optical mapping video. Our results demonstrate that Kt, MSF and MSE were able to accurately identify the pivot point of the rotor under all three clinical limitations. The MSE approach demonstrated the best overall performance, but Kt was the best at identifying the pivot point of the meandering rotor. Artifacts mildly affected the performance of the Kt, MSF and MSE techniques, but had a strong negative impact on the performance of SE. The results of our study motivate further validation of the SE, Kt, MSF and MSE techniques using intra-atrial electrograms from paroxysmal and persistent AF patients to see if these approaches can identify pivot points in a clinical setting. More accurate rotor localization could significantly increase the efficacy of catheter ablation to treat AF, resulting in a higher success rate for single procedures.
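The first of the four metrics, Shannon entropy, can be sketched per pixel of an optical-mapping movie: the pivot of a rotor is expected to stand out as an extremum of signal complexity rather than of local activation. This is an illustrative sketch under assumed parameters (histogram binning, 16 bins), not the authors' exact pipeline:

```python
import numpy as np

def shannon_entropy_map(movie, n_bins=16):
    """Per-pixel Shannon entropy of a movie with shape (frames, rows, cols)."""
    frames, rows, cols = movie.shape
    ent = np.zeros((rows, cols))
    for r in range(rows):
        for c in range(cols):
            # amplitude distribution of this pixel's time series
            hist, _ = np.histogram(movie[:, r, c], bins=n_bins)
            p = hist / hist.sum()
            p = p[p > 0]                       # ignore empty bins
            ent[r, c] = -(p * np.log2(p)).sum()
    return ent
```

The other metrics (kurtosis, multi-scale frequency, multi-scale entropy) would replace the per-pixel statistic while keeping the same map-then-locate structure.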

Keywords: atrial fibrillation, optical mapping, signal processing, rotors

Procedia PDF Downloads 324
4443 The Variable Sampling Interval Xbar Chart versus the Double Sampling Xbar Chart

Authors: Michael B. C. Khoo, J. L. Khoo, W. C. Yeong, W. L. Teoh

Abstract:

The Shewhart Xbar control chart is a useful process monitoring tool in manufacturing industries to detect the presence of assignable causes. However, it is insensitive in detecting small process shifts. To circumvent this problem, adaptive control charts are suggested. An adaptive chart enables at least one of the chart’s parameters to be adjusted to increase the chart’s sensitivity. Two common adaptive charts that exist in the literature are the double sampling (DS) Xbar and variable sampling interval (VSI) Xbar charts. This paper compares the performances of the DS and VSI Xbar charts, based on the average time to signal (ATS) criterion. The ATS profiles of the DS Xbar and VSI Xbar charts are obtained using the Mathematica and Statistical Analysis System (SAS) programs, respectively. The results show that the VSI Xbar chart is generally superior to the DS Xbar chart.
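As a baseline for the ATS comparison, the fixed-parameter Shewhart Xbar chart's ATS has a closed form: ATS = d/(1 − β), where d is the sampling interval and β is the probability that a shifted sample mean still falls inside the control limits. A sketch under assumed parameters (subgroup size n = 5, 3-sigma limits, unit interval are illustrative, not taken from the paper):

```python
from math import erf, sqrt

def norm_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def shewhart_ats(shift, n=5, limit=3.0, interval=1.0):
    """ATS of a fixed-interval Shewhart Xbar chart for a mean shift
    of `shift` process standard deviations."""
    z = shift * sqrt(n)
    beta = norm_cdf(limit - z) - norm_cdf(-limit - z)  # P(no signal)
    return interval / (1.0 - beta)  # ATS = interval * ARL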

Keywords: adaptive charts, average time to signal, double sampling, charts, variable sampling interval

Procedia PDF Downloads 287