Search results for: transform Hough
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 1472

1262 Green Synthesis of Magnetic, Silica Nanocomposite and Its Adsorptive Performance against Organochlorine Pesticides

Authors: Waleed A. El-Said, Dina M. Fouad, Mohamed H. Aly, Mohamed A. El-Gahami

Abstract:

Green synthesis of nanomaterials has received increasing attention as an eco-friendly technology in materials science. Here, we have used two types of green tea leaf extract (i.e. a total extract and a tannin extract) as reducing agents in a rapid, simple, one-step synthesis of a mesoporous silica nanoparticle (MSNPs)/iron oxide (Fe3O4) nanocomposite based on deposition of Fe3O4 onto MSNPs. The MSNPs/Fe3O4 nanocomposite was characterized by X-ray diffraction, Fourier transform infrared spectroscopy, scanning electron microscopy, energy-dispersive X-ray spectroscopy, vibrating sample magnetometry, N2 adsorption, and high-resolution transmission electron microscopy. The average mesoporous silica particle diameter was found to be around 30 nm, with a high surface area (818 m2/g). The MSNPs/Fe3O4 nanocomposite was used for removing the pesticide lindane (an environmental hazard) from aqueous solutions. Fourier transform infrared, UV-vis, high-performance liquid chromatography and gas chromatography techniques were used to confirm the high ability of the MSNPs/Fe3O4 nanocomposite to sense and capture lindane molecules, with a high sorption capacity (more than 89%), which could lead to a new eco-friendly strategy for detecting and removing pesticides and a promising material for water treatment applications.

Keywords: green synthesis, mesoporous silica, magnetic iron oxide NPs, adsorption, lindane

Procedia PDF Downloads 404
1261 Globalization as Instrument for Multi-National Corporation in Transforming Asian’s Perspective towards Clean Water Consumption

Authors: Atanta Gian

Abstract:

It is inevitable that globalization has transformed the world today. Its influence has emerged in almost every aspect of life, especially in shaping people's perceptions. This can be seen in how easily people are affected by the information surrounding them. Due to globalization, the flow of information has become more rapid along with the development of technology. People tend to believe information they obtain themselves; when most people believe a piece of information is true, it tends to be treated as factual and relevant. Therefore, if people gain information about what is best for them in terms of daily consumption, this information can transform their perspective and become a consideration in selecting their daily needs. Looking at this trend, the author sees that globalization can be used by Multi-National Corporations (MNCs) to enhance the promotion of their products. This is done by shaping perspectives on what is best for consumers. An MNC with better technology for external promotion can use this opportunity to steer people's perspectives toward what it wants. In this paper, the author elaborates how globalization is applied by MNCs to shape people's perspective of what is best for them. The author uses a case study to analyze how an MNC could transform the perspectives of Asian people regarding the necessity of better-quality drinking water; in this case, the MNC has shaped the perspective of Asian people in choosing its product by promoting bottled water as the best choice for them. The paper concludes that MNCs are able to shape the world's perspective regarding the need for their products, supported by the globalization that is happening now.

Keywords: consumption, globalisation, influence, information technology, multi-national corporations

Procedia PDF Downloads 182
1260 Impact of Zeolite NaY Synthesized from Kaolin on the Properties of Pyrolytic Oil Derived from Used Tire

Authors: Julius Ilawe Osayi, Peter Osifo

Abstract:

The disposal of solid waste such as used tires is a global challenge, as is the energy crisis driven by rising energy demand amidst price uncertainty and depleting fossil fuel reserves. The effectiveness of pyrolysis as a disposal method that can transform used tires into liquid fuel and other end-products has therefore made the process attractive to researchers. Although used tires have been converted to liquid fuel using pyrolysis, the properties of the liquid fuel still need to be improved. Hence, this paper reports the investigation of zeolite NaY synthesized from kaolin, a soil material locally abundant in the Benin metropolis, as a suitable catalyst, and its effect on the properties of pyrolytic oil produced from used tires. The pyrolysis process was conducted for catalyst-to-used-tire concentrations of 1 to 10 wt.% at a temperature of 600 °C, a heating rate of 15 °C/min and a particle size of 6 mm. No significant increase in pyrolytic oil yield was observed compared to the previously investigated non-catalytic pyrolysis of used tires. However, the Fourier transform infrared (FTIR), nuclear magnetic resonance (NMR) and gas chromatography-mass spectrometry (GC-MS) characterization results revealed that the pyrolytic oil possesses improved physicochemical and fuel properties alongside valuable industrial chemical species. This confirms the possibility of transforming kaolin into a catalyst suitable for improving the fuel properties of the liquid fraction obtainable from thermal cracking of hydrocarbon materials.

Keywords: catalytic pyrolysis, fossil fuel, kaolin, pyrolytic oil, used tyres, Zeolite NaY

Procedia PDF Downloads 152
1259 Computer-Aided Detection of Simultaneous Abdominal Organ CT Images by Iterative Watershed Transform

Authors: Belgherbi Aicha, Hadjidj Ismahen, Bessaid Abdelhafid

Abstract:

Interpretation of medical images benefits from anatomical and physiological priors to optimize computer-aided diagnosis applications. Segmentation of the liver, spleen and kidneys is regarded as a major primary step in the computer-aided diagnosis of abdominal organ diseases. In this paper, a semi-automated method is presented for abdominal organ segmentation in medical image data using mathematical morphology. Our proposed method is based on hierarchical segmentation and the watershed algorithm. In our approach, a powerful technique has been designed to suppress over-segmentation, based on the mosaic image and on the computation of the watershed transform. Our algorithm proceeds in two parts. In the first, we seek to improve the quality of the gradient-mosaic image; in this step, we propose a method for improving the gradient-mosaic image by applying an anisotropic diffusion filter followed by morphological filters. Thereafter, we proceed to the hierarchical segmentation of the liver, spleen and kidneys. To validate the proposed segmentation technique, we have tested it on several images. Our segmentation approach is evaluated by comparing our results with the manual segmentation performed by an expert. The experimental results are described in the last part of this work.
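
A minimal sketch of the gradient-plus-markers watershed step described above, assuming scikit-image and SciPy; total-variation denoising stands in for the anisotropic diffusion filter, and the marker threshold is an illustrative value, not the authors' setting.

```python
import numpy as np
from scipy import ndimage as ndi
from skimage import filters, morphology, restoration, segmentation

def watershed_segmentation(ct_slice, marker_threshold=0.05):
    """Marker-controlled watershed on a smoothed gradient of a 2-D CT slice."""
    # Smoothing step (TV denoising stands in for the anisotropic diffusion filter)
    smoothed = restoration.denoise_tv_chambolle(ct_slice, weight=0.1)
    # Morphological opening removes small bright artefacts before the gradient
    opened = morphology.opening(smoothed, morphology.disk(3))
    # Gradient image on which the watershed lines are computed
    gradient = filters.sobel(opened)
    # Markers: connected low-gradient regions act as catchment-basin seeds
    markers, _ = ndi.label(gradient < marker_threshold)
    return segmentation.watershed(gradient, markers)
```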

Keywords: anisotropic diffusion filter, CT images, morphological filter, mosaic image, simultaneous organ segmentation, the watershed algorithm

Procedia PDF Downloads 413
1258 Denoising Transient Electromagnetic Data

Authors: Lingerew Nebere Kassie, Ping-Yu Chang, Hsin-Hua Huang, Chaw-Son Chen

Abstract:

Transient electromagnetic (TEM) data plays a crucial role in hydrogeological and environmental applications, providing valuable insights into geological structures and resistivity variations. However, the presence of noise often hinders the interpretation and reliability of these data. Our study addresses this issue by utilizing a FASTSNAP system for the TEM survey, which operates at different modes (low, medium, and high) with continuous adjustments to discretization, gain, and current. We employ a denoising approach that processes the raw data obtained from each acquisition mode to improve signal quality and enhance data reliability. We use a signal-averaging technique for each mode, increasing the signal-to-noise ratio. Additionally, we utilize wavelet transform to suppress noise further while preserving the integrity of the underlying signals. This approach significantly improves the data quality, notably suppressing severe noise at late times. The resulting denoised data exhibits a substantially improved signal-to-noise ratio, leading to increased accuracy in parameter estimation. By effectively denoising TEM data, our study contributes to a more reliable interpretation and analysis of underground structures. Moreover, the proposed denoising approach can be seamlessly integrated into existing ground-based TEM data processing workflows, facilitating the extraction of meaningful information from noisy measurements and enhancing the overall quality and reliability of the acquired data.
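
A compact sketch of the two-stage scheme described above (stack averaging followed by wavelet shrinkage), assuming PyWavelets; the wavelet family, decomposition depth, and universal-threshold rule are illustrative choices rather than the authors' exact settings.

```python
import numpy as np
import pywt

def denoise_tem_mode(traces, wavelet="db4", level=4):
    """traces: (n_repeats, n_samples) array of raw TEM decay curves for one acquisition mode."""
    # Signal averaging raises the SNR roughly by sqrt(n_repeats)
    averaged = traces.mean(axis=0)
    # Wavelet shrinkage suppresses the residual (late-time) noise
    coeffs = pywt.wavedec(averaged, wavelet, level=level)
    sigma = np.median(np.abs(coeffs[-1])) / 0.6745            # robust noise estimate
    thresh = sigma * np.sqrt(2.0 * np.log(averaged.size))     # universal threshold
    coeffs = [coeffs[0]] + [pywt.threshold(c, thresh, mode="soft") for c in coeffs[1:]]
    return pywt.waverec(coeffs, wavelet)[: averaged.size]
```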

Keywords: data quality, signal averaging, transient electromagnetic, wavelet transform

Procedia PDF Downloads 56
1257 Constructions of Linear and Robust Codes Based on Wavelet Decompositions

Authors: Alla Levina, Sergey Taranov

Abstract:

The classical approach to providing noise immunity and integrity for information processed in computing devices and communication channels is to use linear codes. Linear codes have fast and efficient encoding and decoding algorithms, but they concentrate their detection and correction abilities on certain error configurations. Robust codes can protect against any configuration of errors with a predetermined probability. This is accomplished by using perfect nonlinear and almost perfect nonlinear functions to calculate the code redundancy. The paper presents an error-correcting coding scheme using the biorthogonal wavelet transform. The wavelet transform is applied in various fields of science; some of its applications are denoising of signals, data compression, and spectral analysis of signal components. The article suggests methods for constructing linear codes based on wavelet decomposition. For the developed constructions, we build generator and check matrices that contain the scaling-function coefficients of the wavelet. Based on the linear wavelet codes, we develop robust codes that provide uniform protection against all errors. We propose two constructions of robust code. The first class of robust code is based on the multiplicative inverse in a finite field; in the second construction, the redundancy part is a cube of the information part. This paper also investigates the characteristics of the proposed robust and linear codes.
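
A small sketch of the second robust construction mentioned above, where the redundancy part is the cube of the information part; the choice of GF(2⁸) and its reduction polynomial is an assumption made only for illustration.

```python
def gf_mul(a, b, poly=0x11B, width=8):
    """Carry-less multiplication in GF(2^8) with reduction polynomial x^8+x^4+x^3+x+1."""
    r = 0
    while b:
        if b & 1:
            r ^= a
        b >>= 1
        a <<= 1
        if a & (1 << width):
            a ^= poly
    return r

def encode_robust(x):
    """Codeword (x, x^3): the check part is the cube of the information part."""
    return x, gf_mul(x, gf_mul(x, x))

def verify_robust(x, check):
    return gf_mul(x, gf_mul(x, x)) == check
```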

Keywords: robust code, linear code, wavelet decomposition, scaling function, error masking probability

Procedia PDF Downloads 463
1256 Traditional Industries Innovation and Brand Value Analysis in Taiwan: Case Study of a Certain Plastic Company

Authors: Ju Shan Lin

Abstract:

The challenge for traditional industries in Taiwan over the past few years has been the change in the overall domestic and foreign industry structure: entrepreneurs not only need to keep improving their professional skills but also to continuously research and develop new products. It is also necessary for all traditional industries to keep updating their business strategies so that the enterprises continue to progress and are not easily replaced by other industries. Traditional industry in Taiwan has attached great importance to enterprise upgrading and innovation in recent years; through innovation and transformation, enterprises can improve their overall business situation and obtain more profit than in the past. Besides the need to transform and upgrade the original industry structure, the brand's business and marketing strategy is also essential. This study takes a certain plastic company as a case for analysis and further explores brand promotion, brand values and business innovation models for traditional industries. It also discusses other traditional-industry cases that have already achieved success in enterprise upgrading and innovation, the difficulties they faced, and the ways they overcame them. This study uses the case study method combined with expert interviews to discuss and analyze the company's current business situation, its existing products and possible future trends. It aims to provide an innovative business model that will enable this plastic company to upgrade its corporate image and transform its brand successfully.

Keywords: brand marketing strategy, enterprise upgrade, industrial transformation, traditional industry

Procedia PDF Downloads 213
1255 Tracking of Linarin from the Ethyl Acetate Fraction of Melinjo (Gnetum gnemon L.) Seeds Using Preparative High Performance Liquid Chromatography

Authors: Asep Sukohar, Ramadhan Triyandi, Muhammad Iqbal, Sahidin, Suharyani

Abstract:

Introduction: Resveratrol is a class of bioactive chemicals found in melinjo, which has a wide range of biological actions. The purpose of this study is to determine the linarin content of the melinjo fraction by using preparative high-performance liquid chromatography (prep-HPLC). Method: Extraction used the Soxhlet method with 96% ethanol as solvent. Fractionation used ethyl acetate and ethanol in a ratio of 1:1. Tracing of the linarin compound used prep-HPLC with a mobile phase of distilled water:methanol (55:45, v/v). The presence of linarin was detected at a wavelength of 215 nm. Fourier transform infrared (FTIR) spectroscopy was used to identify the functional groups of the compound. Result: The retention time required to elute the ethyl acetate fraction was 2.601 minutes. Identification of the separated compounds using Fourier Transform Infrared Spectroscopy - Quest Attenuated Total Reflectance (FTIR-QATR) gives similarity values with standards ranging from 0 to 1000. The elution results of the ethyl acetate fraction have similarity values with the standard compounds linarin (668), resveratrol (578), and catechin (455). Conclusion: Tracing of the active compound in the ethyl acetate fraction of Gnetum gnemon L. using prep-HPLC showed a strong indication of the presence of the linarin compound.

Keywords: Gnetum gnemon L., linarin, prep-HPLC, ethyl acetate fraction

Procedia PDF Downloads 58
1254 The Mechanism Study of Degradative Solvent Extraction of Biomass by Liquid Membrane-Fourier Transform Infrared Spectroscopy

Authors: W. Ketren, J. Wannapeera, Z. Heishun, A. Ryuichi, K. Toshiteru, M. Kouichi, O. Hideaki

Abstract:

Degradative solvent extraction is a method developed for biomass upgrading by dewatering and fractionation of biomass under mild conditions. However, the conversion mechanism of the degradative solvent extraction method has not been fully understood so far. Rice straw was treated in 1-methylnaphthalene (1-MN) at solvent-treatment temperatures varying from 250 to 350 °C with a residence time of 60 min. The liquid membrane-Fourier transform infrared spectroscopy (FTIR) technique is applied to study the processing mechanism in depth without separation of the solvent. It has been found that the strength of the oxygen-hydrogen stretching band (3600-3100 cm⁻¹) decreased slightly with increasing temperature in the range of 300-350 °C. The decrease of the hydroxyl group in the solvent-soluble fraction suggested a dehydration reaction taking place between 300 and 350 °C. FTIR spectra in the carbonyl stretching region (1800-1600 cm⁻¹) revealed the presence of ester groups, carboxylic acids and ketonic groups in the solvent-soluble fraction of the biomass. The carboxylic acid content increased in the range of 200 to 250 °C and then decreased. The prevalence of aromatic groups showed that aromatization took place during extraction above 250 °C. From 300 to 350 °C, the carbonyl functional groups in the solvent-soluble fraction noticeably decreased. The removal of the carboxylic acids and the decrease of esters into the form of carbon dioxide indicate that a decarboxylation reaction occurred during the extraction process.

Keywords: biomass waste, degradative solvent extraction, mechanism, upgrading

Procedia PDF Downloads 255
1253 Diagnosis of Induction Machine Faults by DWT

Authors: Hamidreza Akbari

Abstract:

In this paper, time-frequency analysis of the stator startup current is carried out to detect inclined eccentricity in an induction motor. For this purpose, the discrete wavelet transform is used. Data are obtained from simulations using the winding function approach. The results show the validity of the approach for detecting the fault and discriminating it from other faults.
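
A brief sketch of the kind of feature such a time-frequency analysis produces, assuming PyWavelets; the wavelet family and decomposition depth are illustrative choices, not those of the paper.

```python
import numpy as np
import pywt

def dwt_level_energies(startup_current, wavelet="db8", level=6):
    """Energy in each DWT detail level of the stator start-up current;
    shifts in these energies serve as the eccentricity-fault signature."""
    coeffs = pywt.wavedec(startup_current, wavelet, level=level)
    return np.array([float(np.sum(c ** 2)) for c in coeffs[1:]])  # details d1..dN
```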

Keywords: induction machine, fault, DWT, electric

Procedia PDF Downloads 317
1252 Soil Macronutrients Sensing for Precision Agriculture Purpose Using Fourier Transform Infrared Spectroscopy

Authors: Hossein Navid, Maryam Adeli Khadem, Shahin Oustan, Mahmoud Zareie

Abstract:

Among the nutrients needed by plants, the three elements nitrate, phosphorus and potassium are the most important. The objective of this research was to measure the amounts of these nutrients in soil using Fourier transform infrared spectroscopy in the range of 400-4000 cm⁻¹. Soil samples of different soil types (sandy, clay and loam) were collected from different areas of East Azerbaijan. Three fertilizers used in conventional farming (urea, triple superphosphate, potassium sulphate) were used for soil treatment. Each specimen was divided into two groups: the first group was used in the laboratory (direct measurement) to extract nitrate, phosphorus and potassium by the Olsen and ammonium acetate colorimetric methods. The second group was used for infrared absorption spectrometry. For the spectrometry, small amounts of the soil samples were mixed with KBr and pressed into small pellets. For the tests, the pellets were placed in the centre of the infrared spectrometer and spectra were obtained. Data analysis was done using MINITAB and PLSR software. The data obtained from the spectrometric method were compared with the soil nutrient amounts obtained from direct measurement using EXCEL software. There was a good fit between these two data series: for nitrate, phosphorus and potassium, R2 was 79.5%, 92.0% and 81.9%, respectively. The results also showed that the MIR (mid-infrared) range is appropriate for determining the amount of soil nitrate and potassium and can be used in future research to obtain detailed maps of land in agricultural use.
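
A compact sketch of a spectra-to-nutrient calibration of the kind reported above (the abstract cites PLSR software); the number of latent components and the cross-validation scheme are assumptions.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_predict
from sklearn.metrics import r2_score

def calibrate_pls(X, y, n_components=8):
    """X: FTIR absorbance spectra (n_samples, n_wavenumbers); y: lab-measured nutrient values."""
    pls = PLSRegression(n_components=n_components)
    y_cv = cross_val_predict(pls, X, y, cv=5)                 # cross-validated predictions
    print(f"cross-validated R^2: {r2_score(y, np.ravel(y_cv)):.3f}")
    return pls.fit(X, y)                                      # final model on all samples
```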

Keywords: nitrate, phosphorus, potassium, soil nutrients, spectroscopy

Procedia PDF Downloads 365
1251 Realization of Hybrid Beams Inertial Amplifier

Authors: Somya Ranjan Patro, Abhigna Bhatt, Arnab Banerjee

Abstract:

The inertial amplifier has recently gained increasing attention as a new mechanism for vibration control of structures. Theoretical investigations are currently undertaken by researchers to reveal its fundamentals and to understand the underlying principles by which it alters the structural response of structures against dynamic loading. This paper presents experimental and analytical studies on the dynamic characteristics of a hybrid beam inertial amplifier (HBIA). The analytical formulation of the HBIA has been derived by implementing the spectral element method and rigid body dynamics. This formulation gives the relation between the dynamic force and the response of the structure in the frequency domain. Further, experiments have been performed to validate the proposed HBIA. The experimental setup consists of a 3D-printed HBIA of polylactic acid (PLA) material screwed to the base plate of the shaker system. Two accelerometers are used to study the response, one at the base plate of the shaker and the second placed at the top of the inertial amplifier. A force transducer is also placed between the base plate and the inertial amplifier to calculate the total amount of load transferred from the base plate to the inertial amplifier. The time-domain responses obtained from the accelerometers have been converted into the frequency domain using the fast Fourier transform (FFT) algorithm. The experimental transmittance values are successfully validated with the analytical results, providing essential confidence in the proposed methodology.
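
A short sketch of the transmittance computation implied above: the ratio of FFT magnitudes of the top and base accelerometer records; the Hann window is an added assumption to limit spectral leakage.

```python
import numpy as np

def transmittance(base_accel, top_accel, fs):
    """Frequency-domain transmittance |A_top(f)| / |A_base(f)| from two accelerometer records."""
    n = len(base_accel)
    window = np.hanning(n)
    freqs = np.fft.rfftfreq(n, d=1.0 / fs)
    base_spec = np.fft.rfft(base_accel * window)
    top_spec = np.fft.rfft(top_accel * window)
    return freqs, np.abs(top_spec) / np.abs(base_spec)
```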

Keywords: inertial amplifier, fast fourier transform, natural frequencies, polylactic acid, transmittance, vibration absorbers

Procedia PDF Downloads 68
1250 Fault Analysis of Induction Machine Using Finite Element Method (FEM)

Authors: Wiem Zaabi, Yemna Bensalem, Hafedh Trabelsi

Abstract:

The paper presents a finite element (FE) based efficient analysis procedure for the induction machine (IM). Two FE formulation approaches are proposed to achieve this goal: the magnetostatic and the non-linear transient time-stepped formulations. A study based on finite element models offers much more information on the phenomena characterizing the operation of electrical machines than classical analytical models, which explains the increasing interest in finite element investigations of electrical machines. Based on finite element models, this paper studies the influence of stator and rotor faults on the behavior of the IM. In this work, a simple dynamic model for an IM with an inter-turn winding fault and a broken bar fault is presented. This fault model is used to study the IM under various fault conditions and severities. Simulations are conducted to validate the fault model for different levels of fault severity. The comparison of the results obtained by the simulation tests allowed verifying the precision of the proposed FEM model. This paper presents a technique based on fast Fourier transform (FFT) analysis of the stator current and the electromagnetic torque to detect broken rotor bar faults. The technique used and the obtained results show clearly the possibility of extracting signatures to detect and locate faults.
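
A minimal sketch of the FFT-based signature extraction mentioned above, looking at the (1 ± 2s)·f sidebands of the stator-current spectrum associated with broken rotor bars; the supply frequency and slip are illustrative values.

```python
import numpy as np

def broken_bar_sidebands(stator_current, fs, f_supply=50.0, slip=0.03):
    """Amplitudes at the (1 - 2s)f and (1 + 2s)f sidebands of the stator-current spectrum."""
    n = len(stator_current)
    spectrum = np.abs(np.fft.rfft(stator_current * np.hanning(n))) / n
    freqs = np.fft.rfftfreq(n, d=1.0 / fs)
    def amplitude_at(f):
        return spectrum[np.argmin(np.abs(freqs - f))]
    return {f: amplitude_at(f) for f in ((1 - 2 * slip) * f_supply,
                                         (1 + 2 * slip) * f_supply)}
```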

Keywords: Finite element Method (FEM), Induction motor (IM), short-circuit fault, broken rotor bar, Fast Fourier Transform (FFT) analysis

Procedia PDF Downloads 268
1249 DWT-SATS Based Detection of Image Region Cloning

Authors: Michael Zimba

Abstract:

A duplicated image region may be subjected to a number of attacks such as noise addition, compression, reflection, rotation, and scaling with the intention of either merely matching it to its targeted neighborhood or preventing its detection. In this paper, we present an effective and robust method of detecting duplicated regions, including those affected by the various attacks. In order to reduce the dimension of the image, the proposed algorithm first performs the discrete wavelet transform, DWT, of a suspicious image. However, unlike most existing copy-move image forgery (CMIF) detection algorithms operating in the DWT domain, which extract only the low-frequency sub-band of the DWT of the suspicious image and thereby leave valuable information in the other three sub-bands, the proposed algorithm simultaneously extracts features from all four sub-bands. The extracted features are not only a more accurate representation of image regions but are also robust to additive noise, JPEG compression, and affine transformation. Furthermore, principal component analysis-eigenvalue decomposition, PCA-EVD, is applied to reduce the dimension of the features. The extracted features are then sorted using the more computationally efficient radix sort algorithm. Finally, same affine transformation selection, SATS, a duplication verification method, is applied to detect duplicated regions. The proposed algorithm is not only fast but also more robust to attacks compared to related CMIF detection algorithms. The experimental results show high detection rates.
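
A stripped-down sketch of the feature-extraction stage described above (all four DWT sub-bands per block, PCA for dimension reduction, then sorting); the block size, stride, PCA dimension, and the use of NumPy's lexicographic sort in place of radix sort are simplifications for illustration.

```python
import numpy as np
import pywt
from sklearn.decomposition import PCA

def block_features(gray, block=8, stride=4, n_components=16):
    """Per-block feature vectors drawn from all four DWT sub-bands of a grayscale image."""
    cA, (cH, cV, cD) = pywt.dwt2(gray.astype(float), "haar")
    stacked = np.stack([cA, cH, cV, cD], axis=-1)            # keep every sub-band
    h, w, _ = stacked.shape
    feats, positions = [], []
    for i in range(0, h - block + 1, stride):
        for j in range(0, w - block + 1, stride):
            feats.append(stacked[i:i + block, j:j + block].ravel())
            positions.append((i, j))
    feats = PCA(n_components=n_components).fit_transform(np.array(feats))
    order = np.lexsort(np.round(feats).T[::-1])              # lexicographic sort of feature rows
    return feats[order], np.array(positions)[order]
```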

Keywords: affine transformation, discrete wavelet transform, radix sort, SATS

Procedia PDF Downloads 201
1248 Hindi Speech Synthesis by Concatenation of Recognized Hand Written Devnagri Script Using Support Vector Machines Classifier

Authors: Saurabh Farkya, Govinda Surampudi

Abstract:

Optical character recognition is one of the current major research areas. This paper is focused on recognition of the Devanagari script and its sound generation, and consists of two parts: first, optical character recognition of handwritten Devanagari script; second, speech synthesis of the recognized text. The paper presents an implementation of support vector machines for Devanagari script recognition. The support vector machine was trained with multi-domain features: transform-domain and spatial-domain (structural) features. The transform domain includes the wavelet features of the character; the structural domain consists of the distance profile feature and the gradient feature. Segmentation of the text document has been done at three levels: line segmentation, word segmentation, and character segmentation. Pre-processing of the characters has been done with the help of various operations: Otsu's algorithm, erosion, dilation, filtering and thinning techniques. The algorithm was tested on a self-prepared database, a collection of various handwriting samples. Further, Unicode was used to convert the recognized Devanagari text into a machine-readable document. The document so obtained is an array of codes which was used to generate digitized text and to synthesize Hindi speech. Phonemes from the self-prepared database were used to generate the speech of the scanned document using a concatenation technique.
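
A minimal sketch of the classifier stage, assuming scikit-learn (whose SVC wraps LIBSVM); the RBF kernel and hyperparameters are illustrative, and X is assumed to hold the multi-domain (wavelet, distance-profile, gradient) feature vectors described above.

```python
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

def train_devanagari_svm(X, y):
    """X: multi-domain feature vectors per segmented character; y: character labels."""
    clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=10.0, gamma="scale"))
    return clf.fit(X, y)
```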

Keywords: Character Recognition (OCR), Text to Speech (TTS), Support Vector Machines (SVM), Library of Support Vector Machines (LIBSVM)

Procedia PDF Downloads 468
1247 Extracting the Coupled Dynamics in Thin-Walled Beams from Numerical Data Bases

Authors: Mohammad A. Bani-Khaled

Abstract:

In this work, we use the discrete proper orthogonal decomposition transform to characterize the properties of coupled dynamics in thin-walled beams by exploiting numerical simulations obtained from finite element simulations. The outcomes of this work will improve our understanding of the linear and nonlinear coupled behavior of thin-walled beam structures. Thin-walled beams have widespread usage in modern engineering applications, both in large-scale structures (aeronautical structures) and in nano-structures (nano-tubes). Therefore, detailed knowledge of the properties of coupled vibrations and buckling in these structures is of great interest to the research community. Due to the geometric complexity of the overall structure, and in particular of the cross-sections, it is necessary to employ computational mechanics to numerically simulate the dynamics. In using numerical computational techniques, it is not necessary to oversimplify a model in order to solve the equations of motion. Computational dynamics methods produce databases of controlled resolution in time and space, and these numerical databases contain information on the properties of the coupled dynamics. In order to extract the system's dynamic properties and the strength of coupling among the various fields of the motion, processing techniques are required. The time proper orthogonal decomposition transform is a powerful tool for processing such databases for the dynamics. It will be used to study the coupled dynamics of thin-walled basic structures. These structures are ideal to form a basis for a systematic study of coupled dynamics in structures of complex geometry.
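
A minimal sketch of the discrete POD step, assuming the finite element displacement histories are gathered into a snapshot matrix; mean removal and the retained mode count are conventional choices rather than details from the paper.

```python
import numpy as np

def pod_modes(snapshots, n_modes=5):
    """Discrete POD of a snapshot matrix of shape (n_dof, n_time) from an FE simulation."""
    mean = snapshots.mean(axis=1, keepdims=True)
    U, s, _ = np.linalg.svd(snapshots - mean, full_matrices=False)
    energy = s ** 2 / np.sum(s ** 2)             # fraction of the total 'energy' per mode
    return U[:, :n_modes], s[:n_modes], energy[:n_modes]
```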

Keywords: coupled dynamics, geometric complexity, proper orthogonal decomposition (POD), thin walled beams

Procedia PDF Downloads 393
1246 Embedded System of Signal Processing on FPGA: Underwater Application Architecture

Authors: Abdelkader Elhanaoui, Mhamed Hadji, Rachid Skouri, Said Agounad

Abstract:

The purpose of this paper is to study the phenomenon of acoustic scattering by using a new method. Signal processing (fast Fourier transform (FFT), inverse fast Fourier transform (iFFT), and Bessel functions) is widely applied to obtain information with high precision and accuracy. Signal processing is commonly implemented on general-purpose processors. Our interest was focused on the use of FPGAs (Field-Programmable Gate Arrays) in order to minimize the computational complexity of a single-processor architecture, to accelerate processing on the FPGA, and to meet real-time and energy-efficiency requirements; general-purpose processors are not efficient for this signal processing. We implemented the acoustic backscattered signal processing model on the Altera DE-SOC board and compared it to the Odroid XU4. By comparison, the computing latencies of the Odroid XU4 and the FPGA are 60 seconds and 3 seconds, respectively. The detailed SoC FPGA-based system has shown that acoustic spectra are computed up to 20 times faster than in the Odroid XU4 implementation. The FPGA-based system of processing algorithms is realized with an absolute error of about 10⁻³. This study underlines the increasing importance of embedded systems in underwater acoustics, especially in non-destructive testing, where it is possible to obtain information related to the detection and characterization of submerged cells. We have thus achieved good experimental results in real time with energy efficiency.
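
A host-side NumPy reference for the spectral part of the pipeline, shown only to fix ideas; it is not the FPGA implementation, and normalising the backscattered spectrum by the incident one is an assumption about how the acoustic spectra are estimated here.

```python
import numpy as np

def backscatter_spectrum(incident, backscattered, fs):
    """Ratio of the backscattered-echo spectrum to the incident-pulse spectrum."""
    n = max(len(incident), len(backscattered))
    inc_spec = np.fft.rfft(incident, n)
    bsc_spec = np.fft.rfft(backscattered, n)
    freqs = np.fft.rfftfreq(n, d=1.0 / fs)
    eps = 1e-12                                  # avoid division by zero
    return freqs, np.abs(bsc_spec) / (np.abs(inc_spec) + eps)
```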

Keywords: DE1 FPGA, acoustic scattering, form function, signal processing, non-destructive testing

Procedia PDF Downloads 50
1245 Method to Find a ε-Optimal Control of Stochastic Differential Equation Driven by a Brownian Motion

Authors: Francys Souza, Alberto Ohashi, Dorival Leao

Abstract:

We present a general solution for finding ε-optimal controls for non-Markovian stochastic systems described by stochastic differential equations driven by Brownian motion, a problem recognized as difficult to solve. The contribution lies in the development of mathematical tools to deal with the modeling and control of non-Markovian systems, whose applicability in different areas is well known. The methodology consists of discretizing the problem through a random discretization. In this way, we transform an infinite-dimensional problem into a finite-dimensional one; thereafter, we use measurable selection arguments to find a control in explicit form for the discretized problem. Then, we prove that the control found for the discretized problem is an ε-optimal control for the original problem. Our theory provides a concrete description of a rather general class of problems; among the principal ones, we can highlight financial problems such as portfolio control, hedging, super-hedging, pairs trading and others. Therefore, our main contribution is the development of a tool to construct explicitly the ε-optimal control for non-Markovian stochastic systems. The pathwise analysis, carried out through a random discretization jointly with measurable selection arguments, has provided us with a structure to transform an infinite-dimensional problem into a finite-dimensional one. The theory is applied to stochastic control problems based on path-dependent stochastic differential equations, where both the drift and diffusion components are controlled. We are able to show the optimal control explicitly with our method.
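
For readers unfamiliar with the terminology, the controlled system and the ε-optimality criterion can be written in the following standard form; the notation below is assumed for illustration and is not taken from the paper.

```latex
% Controlled (possibly path-dependent) SDE and \varepsilon-optimality, in standard notation.
\[
  dX_t^{u} = b\bigl(t, X^{u}, u_t\bigr)\,dt + \sigma\bigl(t, X^{u}, u_t\bigr)\,dB_t ,
  \qquad X_0^{u} = x_0 ,
\]
\[
  V(x_0) = \sup_{u \in \mathcal{U}} \mathbb{E}\bigl[\xi(X^{u})\bigr] ,
  \qquad
  u^{\varepsilon}\ \text{is } \varepsilon\text{-optimal if}\quad
  \mathbb{E}\bigl[\xi(X^{u^{\varepsilon}})\bigr] \ge V(x_0) - \varepsilon .
\]
```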

Keywords: dynamic programming equation, optimal control, stochastic control, stochastic differential equation

Procedia PDF Downloads 145
1244 Niftiness of the COLME to Promote Shared Decision-Making in Organizations

Authors: Prakash Singh

Abstract:

The question that arises is whether a theory such as the Collegial Leadership Model of Emancipation (COLME) has the potency to introduce leadership change by empowering and emancipating employees. It is a fallacy to simply assume that experience alone, in the absence of theory, will contribute to the knowledge base needed to develop collegial leaders. The focus of this study is therefore to ascertain whether the COLME can serve as a conceptual framework to transform traditional bureaucratic management practices (TBMPs) in order to promote shared decision-making in organizations such as schools. All the respondents in this exploratory qualitative study embraced collegiality to transform TBMPs in their organizations. For the positive effects to be sustained, the collegial practices need to be evolutionary and emancipatory in order to evoke the values of collegial leadership, as elucidated by the findings of this study. Interviewees affirmed that the COLME provides an astute framework to develop commendable collegial leadership practices, as it clearly outlines procedures to develop and use the leadership potential of all employees in order to foster joint accountability. They acknowledged that when the principles of collegiality are flexibly applied, they contribute to the creation of a holistic milieu in which all employees are able to express themselves freely, without fear of failure, and thus feel that they are part of the democratic decision-making process. Evidently, a conceptual framework such as the COLME can serve as a benchmark for leadership effectiveness because organizational outcomes need to be measured against standards of excellence in meeting both employee and customer expectations.

Keywords: collegial leadership model, employee empowerment, shared decision-making, traditional bureaucratic management practices

Procedia PDF Downloads 455
1243 A Neurofeedback Learning Model Using Time-Frequency Analysis for Volleyball Performance Enhancement

Authors: Hamed Yousefi, Farnaz Mohammadi, Niloufar Mirian, Navid Amini

Abstract:

Investigating the capacities of visual functions whose adapted mechanisms can enhance the capabilities of sports trainees is a promising area of research, not only from the cognitive viewpoint but also in terms of unlimited applications in sports training. In this paper, the visual evoked potential (VEP) and event-related potential (ERP) signals of amateur and trained volleyball players in a pilot study were processed. The two groups of amateur and trained subjects were asked to imagine themselves in the state of receiving a ball while they were shown a simulated volleyball field. The proposed method is based on a set of time-frequency features extracted from the VEP signals using algorithms such as the Gabor filter, the continuous wavelet transform, and a multi-stage wavelet decomposition; these features can be indicative of a subject being amateur or trained. The linear discriminant classifier achieves an accuracy, sensitivity, and specificity of 100% when the average of the repetitions of the signal corresponding to the task is used. The main purpose of this study is to investigate the feasibility of a fast, robust, and reliable feature/model determination as a neurofeedback parameter to be utilized for improving volleyball players' performance. The proposed measure has potential applications in brain-computer interface technology, where a real-time biomarker is needed.
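
A small sketch of one possible time-frequency feature followed by the linear discriminant classifier named above, assuming PyWavelets and scikit-learn; the Morlet wavelet, scale range, and energy summary are illustrative choices.

```python
import numpy as np
import pywt
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

def cwt_scale_energies(epoch, fs, scales=np.arange(1, 64)):
    """Mean energy per CWT scale of one averaged VEP epoch."""
    coeffs, _ = pywt.cwt(epoch, scales, "morl", sampling_period=1.0 / fs)
    return (np.abs(coeffs) ** 2).mean(axis=1)

def train_lda(epochs, labels, fs):
    """epochs: iterable of 1-D VEP arrays; labels: amateur/trained."""
    X = np.array([cwt_scale_energies(e, fs) for e in epochs])
    return LinearDiscriminantAnalysis().fit(X, labels)
```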

Keywords: visual evoked potential, time-frequency feature extraction, short-time Fourier transform, event-related spectrum potential classification, linear discriminant analysis

Procedia PDF Downloads 105
1242 Application of Improved Semantic Communication Technology in Remote Sensing Data Transmission

Authors: Tingwei Shu, Dong Zhou, Chengjun Guo

Abstract:

Semantic communication is an emerging form of communication that realizes intelligent communication by extracting the semantic information of data at the source, transmitting it, and recovering the data at the receiving end. It can effectively solve the problem of data transmission in situations of large data volume, low SNR and restricted bandwidth. With the development of deep learning, semantic communication has further matured and is gradually being applied in the Internet of Things, unmanned aerial vehicle cluster communication, remote sensing scenarios, etc. We propose an improved semantic communication system for situations where the data volume is huge and the spectrum resources are limited during the transmission of remote sensing images. At the transmitter, we need to extract the semantic information of remote sensing images, but there are some problems: the traditional semantic communication system based on a convolutional neural network (CNN) cannot take into account both the global and local semantic information of the image, which results in less-than-ideal image recovery at the receiving end. Therefore, we adopt an improved Vision-Transformer-based structure as the semantic encoder instead of the mainstream CNN-based one to extract the image semantic features. In this paper, we first perform pre-processing operations on the remote sensing images to improve their resolution in order to obtain images with more semantic information. We use the wavelet transform to decompose the image into high-frequency and low-frequency components, perform bilinear interpolation on the high-frequency components and bicubic interpolation on the low-frequency components, and finally perform the inverse wavelet transform to obtain the pre-processed image. We adopt the improved Vision-Transformer structure as the semantic coder to extract and transmit the semantic information of the remote sensing images; this structure can better handle the huge data volume and extract better image semantic features, and it adopts a multi-layer self-attention mechanism to better capture the correlation between semantic features and reduce redundant features. Secondly, to improve the coding efficiency, we reduce the quadratic complexity of the self-attention mechanism to linear so as to improve the image data processing speed of the model. We conducted experimental simulations on the RSOD dataset and compared the designed system with a CNN-based semantic communication system and image coding methods such as BPG and JPEG, verifying that the method can effectively alleviate the problem of excessive data volume and improve the performance of image data communication.
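
A small sketch of the pre-processing step described above (wavelet split, bicubic interpolation of the low-frequency band, bilinear interpolation of the high-frequency bands, inverse transform), assuming PyWavelets and OpenCV; the Haar wavelet is an illustrative choice.

```python
import cv2
import numpy as np
import pywt

def wavelet_upsample(gray):
    """Returns an image of roughly twice the input resolution via DWT band interpolation."""
    cA, (cH, cV, cD) = pywt.dwt2(gray.astype(np.float32), "haar")
    h, w = gray.shape
    up = lambda band, interp: cv2.resize(band.astype(np.float32), (w, h), interpolation=interp)
    cA = up(cA, cv2.INTER_CUBIC)                                  # bicubic on the low-frequency band
    cH, cV, cD = (up(b, cv2.INTER_LINEAR) for b in (cH, cV, cD))  # bilinear on the high-frequency bands
    return pywt.idwt2((cA, (cH, cV, cD)), "haar")
```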

Keywords: semantic communication, transformer, wavelet transform, data processing

Procedia PDF Downloads 47
1241 Optimal Image Representation for Linear Canonical Transform Multiplexing

Authors: Navdeep Goel, Salvador Gabarda

Abstract:

Digital images are widely used in computer applications. Storing or transmitting uncompressed images requires considerable storage capacity and transmission bandwidth. Image compression is a means to perform transmission or storage of visual data in the most economical way. This paper explains how images can be encoded to be transmitted over a multiplexing time-frequency domain channel. Multiplexing involves packing together signals whose representations are compact in the working domain. In order to optimize transmission resources, each 4×4 pixel block of the image is transformed, by a suitable polynomial approximation, into a minimal number of coefficients. Using fewer than 4×4 coefficients per block spares a significant amount of transmitted information, but some information is lost. Different approximations for the image transformation have been evaluated: polynomial representation (Vandermonde matrix), least squares with gradient descent, 1-D Chebyshev polynomials, 2-D Chebyshev polynomials, and singular value decomposition (SVD). Results have been compared in terms of nominal compression rate (NCR), compression ratio (CR) and peak signal-to-noise ratio (PSNR) in order to minimize the error function defined as the difference between the original pixel gray levels and the approximated polynomial output. The polynomial coefficients have later been encoded and handled to generate chirps at a target rate of about two chirps per 4×4 pixel block, and then submitted to a transmission multiplexing operation in the time-frequency domain.
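
A brief sketch of the per-block least-squares polynomial fit described above, using a Vandermonde-style design matrix; the polynomial degree and the normalisation of the pixel grid are illustrative choices.

```python
import numpy as np

def block_poly_fit(block, degree=2):
    """Fit a low-order 2-D polynomial to one 4x4 pixel block; returns coefficients and design matrix."""
    y, x = np.mgrid[0:4, 0:4] / 3.0                              # normalised pixel coordinates
    terms = [x ** i * y ** j for i in range(degree + 1) for j in range(degree + 1 - i)]
    V = np.stack([t.ravel() for t in terms], axis=1)             # 16 x n_coeff Vandermonde-style matrix
    coeffs, *_ = np.linalg.lstsq(V, block.ravel(), rcond=None)
    return coeffs, V

def block_poly_reconstruct(coeffs, V):
    return (V @ coeffs).reshape(4, 4)
```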

Keywords: chirp signals, image multiplexing, image transformation, linear canonical transform, polynomial approximation

Procedia PDF Downloads 392
1240 Preceramic Polymers Formulations for Potential Additive Manufacturing

Authors: Saja M. Nabat Al-Ajrash, Charles Browning, Rose Eckerle, Li Cao

Abstract:

Three preceramic polymer formulations for potential use in 3D printing technologies were investigated. The polymeric precursors include an allyl hydrido polycarbosilane (SMP-10), an SMP-10/1,6-hexanediol diacrylate (HDDA) mixture, and polydimethylsiloxane (PDMS). The rheological properties of the polymeric precursors, including the viscosity over a wide shear-rate range, were compared to determine their applicability in additive manufacturing technology. The structural properties of the polymeric solutions and their photocurability were investigated using Fourier transform infrared spectroscopy (FTIR) and differential scanning calorimetry (DSC). Moreover, thermogravimetric analysis (TGA) and X-ray diffraction (XRD) were utilized to study the polymer-to-ceramic conversion of the various precursors. The prepared precursor resin proved to have outstanding photocuring properties and the ability to transform to the silicon carbide phase at temperatures as low as 850 °C. The obtained ceramic was fully dense, with nearly linear shrinkage and a shiny, smooth surface after pyrolysis. Furthermore, after pyrolysis to 1350 °C and TGA analysis, the PDMS polymer showed the highest onset decomposition temperature and the lowest retained weight (52 wt%), while SMP-10/HDDA showed the lowest onset temperature and ceramic yield (71.7 wt%). In terms of crystallography, the ceramic matrix composite appeared to have three coexisting phases, including silicon carbide and silicon oxycarbide. The results are very promising for fabricating ceramic materials with complex geometries that work at high temperatures.

Keywords: preceramic polymer, silicon carbide, photocuring, allyl hydrido polycarbosilane, SMP-10

Procedia PDF Downloads 97
1239 Improving Fault Tolerance and Load Balancing in Heterogeneous Grid Computing Using Fractal Transform

Authors: Saad M. Darwish, Adel A. El-Zoghabi, Moustafa F. Ashry

Abstract:

The popularity of the Internet and the availability of powerful computers and high-speed networks as low-cost commodity components are changing the way we use computers today. These technical opportunities have led to the possibility of using geographically distributed and multi-owner resources to solve large-scale problems in science, engineering, and commerce. Recent research on these topics has led to the emergence of a new paradigm known as Grid computing. To achieve the promising potential of tremendous distributed resources, effective and efficient load-balancing algorithms are fundamentally important. Unfortunately, load-balancing algorithms in traditional parallel and distributed systems, which usually run on homogeneous and dedicated resources, cannot work well in these new circumstances. In this paper, the concept of a fast fractal transform in heterogeneous grid computing, based on an R-tree and the domain-range entropy, is proposed to improve fault tolerance and load balancing by taking into account connectivity, communication delay, network bandwidth, resource availability, and resource unpredictability. A novel two-dimensional figure of merit is suggested to describe the network effects on load-balance and fault-tolerance estimation. Fault tolerance is enhanced by adaptively decreasing replication time and message cost, while load balance is enhanced by adaptively decreasing the mean job response time. Experimental results show that the proposed method yields superior performance over other methods.

Keywords: Grid computing, load balancing, fault tolerance, R-tree, heterogeneous systems

Procedia PDF Downloads 457
1238 A West Coast Estuarine Case Study: A Predictive Approach to Monitor Estuarine Eutrophication

Authors: Vedant Janapaty

Abstract:

Estuaries are wetlands where fresh water from streams mixes with salt water from the sea. Also known as the "kidneys of our planet", they are extremely productive environments that filter pollutants, absorb floods from sea level rise, and shelter a unique ecosystem. However, eutrophication and loss of native species are ailing our wetlands, there is a lack of uniform data collection, and research on correlations between satellite data and in situ measurements is sparse. Remote sensing (RS) has shown great promise in environmental monitoring. This project attempts to use satellite data and correlate derived metrics with in situ observations collected at five estuaries. Satellite images were processed in Python to calculate 7 spectral indices (SIs), and average SI values were calculated per month for 23 years. Publicly available data from 6 sites at ELK were used to obtain 10 in situ parameters (OPs), and average OP values were likewise calculated per month for 23 years. Linear correlations between the 7 SIs and 10 OPs were computed and found to be inadequate (correlation = 1 to 64%). Fourier transform analysis was then performed on the 7 SIs; dominant frequencies and amplitudes were extracted for each SI, and a machine learning (ML) model was trained, validated, and tested for the 10 OPs. Better correlations were observed between SIs and OPs with certain time delays (0, 3, 4, or 6 months), and ML was performed again; the OPs saw improved R² values in the range of 0.2 to 0.93. This approach can be used to obtain periodic analyses of overall wetland health from satellite indices. It proves that remote sensing can be used to develop correlations with critical in situ parameters that measure eutrophication and can be used by practitioners to easily monitor wetland health.
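
A minimal sketch of the Fourier-feature step and the subsequent regression described above; the number of retained harmonics and the choice of a random forest regressor are assumptions, since the abstract does not name the ML model.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

def dominant_harmonics(monthly_series, k=3):
    """Top-k frequencies and amplitudes of a monthly spectral-index series."""
    spectrum = np.fft.rfft(monthly_series - np.mean(monthly_series))
    amps = np.abs(spectrum) / len(monthly_series)
    freqs = np.fft.rfftfreq(len(monthly_series))
    top = np.argsort(amps)[::-1][:k]
    return np.concatenate([freqs[top], amps[top]])

def fit_op_model(si_series_list, op_values):
    """si_series_list: one SI time series per sample; op_values: the (possibly lagged) in situ parameter."""
    X = np.array([dominant_harmonics(s) for s in si_series_list])
    return RandomForestRegressor(n_estimators=300, random_state=0).fit(X, op_values)
```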

Keywords: estuary, remote sensing, machine learning, Fourier transform

Procedia PDF Downloads 66
1237 Particle Swarm Optimization Algorithm vs. Genetic Algorithm for Image Watermarking Based Discrete Wavelet Transform

Authors: Omaima N. Ahmad AL-Allaf

Abstract:

Over communication networks, images can be easily copied and distributed in illegal ways, so copyright protection for authors and owners is necessary. Digital watermarking techniques therefore play an important role as a valid solution to authority problems. Digital image watermarking techniques are used to hide watermarks in images to achieve copyright protection and prevent illegal copying. Watermarks need to be robust to attacks and to maintain data quality. We therefore discuss in this paper two approaches for image watermarking: the first is based on Particle Swarm Optimization (PSO) and the second on a Genetic Algorithm (GA). The discrete wavelet transform (DWT) is used with the two approaches separately in the embedding process for the cover image transformation. Both PSO and GA rely on the correlation coefficient to detect the high-energy coefficients of the original image for the watermark bits and then hide the watermark in the original image. Many experiments were conducted for the two approaches with different values of the PSO and GA parameters. From the experiments, the PSO approach obtained better results, with PSNR equal to 53 and MSE equal to 0.0039, whereas the GA approach obtained PSNR equal to 50.5 and MSE equal to 0.0048 when using a population size of 100, 150 iterations and 3×3 blocks. According to the results, we can note that a small block size can affect the quality of PSO/GA-based image watermarking because a small block size increases the search area of the watermarking image. Better PSO results were obtained when using a swarm size of 100.
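
A stripped-down sketch of DWT-domain embedding with energy-based coefficient selection, which is the step the PSO/GA search optimises in the paper; the Haar wavelet, the LL-band choice, and the embedding strength alpha are assumptions.

```python
import numpy as np
import pywt

def embed_watermark(cover, wm_bits, alpha=0.05):
    """Hide +/-1-mapped watermark bits in the highest-energy LL coefficients of the cover image."""
    cA, details = pywt.dwt2(cover.astype(float), "haar")
    flat = cA.ravel()
    idx = np.argsort(np.abs(flat))[::-1][: len(wm_bits)]      # high-energy coefficients
    flat[idx] += alpha * np.abs(flat[idx]) * (2 * np.asarray(wm_bits) - 1)
    watermarked = pywt.idwt2((flat.reshape(cA.shape), details), "haar")
    return watermarked, idx                                   # idx is needed again at extraction
```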

Keywords: image watermarking, genetic algorithm, particle swarm optimization, discrete wavelet transform

Procedia PDF Downloads 191
1236 Weyl Type Theorem and the Fuglede Property

Authors: M. H. M. Rashid

Abstract:

Given a Hilbert space H and B(H), the algebra of bounded linear operators on H, let δA,B denote the generalized derivation defined by A and B. The main objective of this article is to study Weyl type theorems for the generalized derivation associated with a pair (A, B) satisfying the Fuglede property.
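
For context, the generalized derivation referred to above is conventionally defined as below, and the Fuglede property of a pair is usually stated in terms of it; the notation here is the standard one and is assumed rather than quoted from the paper.

```latex
\[
  \delta_{A,B}(X) \;=\; AX - XB , \qquad X \in B(H),
\]
\[
  (A,B)\ \text{has the Fuglede property if}\quad
  \delta_{A,B}(X) = 0 \;\Longrightarrow\; \delta_{A^{*},B^{*}}(X) = 0 .
\]
```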

Keywords: Fuglede Property, Weyl’s theorem, generalized derivation, Aluthge transform

Procedia PDF Downloads 107
1235 A Quantitative Study on the Effects of School Development on Character Development

Authors: Merve Gücen

Abstract:

One of the aims of education is to educate individuals who have embraced universal moral principles and transform those principles into moral values. Character education aims to train the behaviors of individuals, in their mental activities, to transform moral principles into moral values in their lives. As a result of this education, individuals are expected to develop positive character traits and become morally sensitive individuals. What are the characteristics of the factors that influence character education at this stage? How should character education help individuals develop positive character traits? Which methods are more effective? These questions come to mind when studying character education, and our research was developed within the framework of these questions. The aim of our study is to ensure the most effective use of the educational factors that affect character. In this context, we tried to explain the definition of character, character development, character education and the factors affecting character education using qualitative research methods. At this stage, character education programs applied in various countries were examined, and a character education program consisting of Islamic values was prepared and implemented in an international Imam Hatip high school in Istanbul. Our program was carried out in collaboration with the school and families: various seminars were organized in the school and the participation of families was ensured. In the last phase of our study, we worked with the students and their families on the effectiveness of the events held during the program. In this study, it was found that activities such as storytelling and theater in character education programs were effective in helping individuals recognize wrong behaviors. It was determined that our program had a positive effect on the quality of education and that its application affected the behavior of the employees in the educational institution.

Keywords: character development, family activities, values education, education program

Procedia PDF Downloads 144
1234 Precoding-Assisted Frequency Division Multiple Access Transmission Scheme: A Cyclic Prefixes-Available Modulation-Based Filter Bank Multi-Carrier Technique

Authors: Ying Wang, Jianhong Xiang, Yu Zhong

Abstract:

The offset Quadrature Amplitude Modulation-based Filter Bank Multi-Carrier (FBMC) system provides superior spectral properties over Orthogonal Frequency Division Multiplexing. However, because it is seriously affected by imaginary interference, its performance is hampered in many areas. In this paper, we propose a Precoding-Assisted Frequency Division Multiple Access (PA-FDMA) modulation scheme. By spreading FBMC symbols into the frequency domain and transmitting them with a precoding matrix, the impact of imaginary interference can be eliminated. Specifically, we first generate the coding pre-solution matrix with a non-uniform fast Fourier transform and pick the best columns by introducing auxiliary factors. Secondly, according to the column indexes, we obtain the precoding matrix for one symbol and impose scaling factors to ensure that the power is approximately constant throughout the transmission time. Finally, we map the precoding matrix of one symbol to multiple symbols and transmit multiple data frames, thus achieving frequency-division multiple access. Additionally, observing the interference between adjacent frames, we mitigate it by adding frequency-domain Cyclic Prefixes (CPs) and evaluate the result with the signal-to-interference ratio. Note that PA-FDMA can be considered a CP-available FBMC technique because the underlying strategy is FBMC. Simulation results show that the proposed scheme has better performance compared to Single Carrier Frequency Division Multiple Access (SC-FDMA), etc.

Keywords: PA-FDMA, SC-FDMA, FBMC, non-uniform fast fourier transform

Procedia PDF Downloads 22
1233 Development and Validation of a Green Analytical Method for the Analysis of Daptomycin Injectable by Fourier-Transform Infrared Spectroscopy (FTIR)

Authors: Eliane G. Tótoli, Hérida Regina N. Salgado

Abstract:

Daptomycin is an important antimicrobial agent in clinical practice nowadays, since it is very active against some Gram-positive bacteria that are particular challenges for medicine, such as methicillin-resistant Staphylococcus aureus (MRSA) and vancomycin-resistant Enterococci (VRE). The importance of environmental preservation has received special attention in recent years. Considering the evident need to protect the natural environment and the introduction of strict quality requirements regarding the analytical procedures used in pharmaceutical analysis, industries must seek environmentally friendly alternatives for the analytical methods and other processes they follow in their routine. In view of these factors, green analytical chemistry is prevalent and encouraged nowadays. In this context, infrared spectroscopy stands out: it is a method that does not use organic solvents and, although it is formally accepted for the identification of individual compounds, it also allows the quantitation of substances. Considering that there are few green analytical methods described in the literature for the analysis of daptomycin, the aim of this work was the development and validation of a green analytical method for the quantification of this drug in lyophilized powder for injectable solution by Fourier-transform infrared spectroscopy (FT-IR). Method: Translucent potassium bromide pellets containing predetermined amounts of the drug were prepared and subjected to spectrophotometric analysis in the mid-infrared region. After obtaining the infrared spectrum and with the assistance of the IR Solution software, quantitative analysis was carried out in the spectral region between 1575 and 1700 cm⁻¹, related to a carbonyl band of the daptomycin molecule; the height of this band was analyzed in terms of absorbance. The method was validated according to ICH guidelines regarding linearity, precision (repeatability and intermediate precision), accuracy and robustness. Results and discussion: The method was shown to be linear (r = 0.9999), precise (RSD < 2.0%), accurate and robust over a concentration range from 0.2 to 0.6 mg/pellet. In addition, this technique does not use organic solvents, which is a great advantage over the most common analytical methods; this helps minimize the generation of organic solvent waste by the industry and thereby reduces the impact of its activities on the environment. Conclusion: The validated method proved to be adequate for quantifying daptomycin in lyophilized powder for injectable solution and can be used for its routine analysis in quality control. In addition, the proposed method is environmentally friendly, which is in line with the global trend.
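
A simple sketch of the band-height calibration implied by the validation data above; the least-squares fit is standard practice for such curves, and the use of NumPy here is our own illustrative choice.

```python
import numpy as np

def calibration_curve(concentrations, band_heights):
    """Linear calibration of carbonyl-band absorbance (1575-1700 cm^-1) vs. drug load per pellet."""
    c = np.asarray(concentrations, dtype=float)
    h = np.asarray(band_heights, dtype=float)
    slope, intercept = np.polyfit(c, h, 1)
    predicted = slope * c + intercept
    r2 = 1.0 - np.sum((h - predicted) ** 2) / np.sum((h - h.mean()) ** 2)
    return slope, intercept, r2                     # r2 should be close to the reported r = 0.9999 squared

def predict_amount(band_height, slope, intercept):
    return (band_height - intercept) / slope        # mg of daptomycin per pellet
```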

Keywords: daptomycin, Fourier-transform infrared spectroscopy, green analytical chemistry, quality control, spectrometry in IR region

Procedia PDF Downloads 355