Search results for: inverse laplace transform techniques
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 8161

7981 Fast Approximate Bayesian Contextual Cold Start Learning (FAB-COST)

Authors: Jack R. McKenzie, Peter A. Appleby, Thomas House, Neil Walton

Abstract:

Cold-start is a notoriously difficult problem which can occur in recommendation systems, and arises when there is insufficient information to draw inferences for users or items. To address this challenge, a contextual bandit algorithm – the Fast Approximate Bayesian Contextual Cold Start Learning algorithm (FAB-COST) – is proposed, which is designed to provide improved accuracy compared to the traditionally used Laplace approximation in the logistic contextual bandit, while controlling both algorithmic complexity and computational cost. To this end, FAB-COST uses a combination of two moment projection variational methods: Expectation Propagation (EP), which performs well at the cold start, but becomes slow as the amount of data increases; and Assumed Density Filtering (ADF), which has slower growth of computational cost with data size but requires more data to obtain an acceptable level of accuracy. By switching from EP to ADF when the dataset becomes large, FAB-COST is able to exploit their complementary strengths. The empirical justification for FAB-COST is presented, and it is systematically compared to other approaches on simulated data. In a benchmark against the Laplace approximation on real data consisting of over 670,000 impressions from autotrader.co.uk, FAB-COST demonstrates at one point an increase of over 16% in user clicks. On the basis of these results, it is argued that FAB-COST is likely to be an attractive approach to cold-start recommendation systems in a variety of contexts.
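
As a rough illustration of the assumed-density-filtering side of this idea, the sketch below maintains a diagonal-Gaussian posterior over weights, updates it with a closed-form moment-matching step (a probit likelihood is used here because its ADF update is analytic), and selects arms by Thompson sampling. This is not the authors' FAB-COST implementation: the EP stage is only indicated by a comment, and the class name, switch point, and toy data are illustrative assumptions.

```python
# Minimal sketch (not the authors' FAB-COST code): online Bayesian updating of a
# diagonal-Gaussian posterior over weights via assumed density filtering (ADF),
# with Thompson sampling for arm selection. A probit likelihood is used so the
# moment-matching update is closed form; the EP stage and the exact switching
# rule of FAB-COST are omitted and only indicated by the `switch_after` threshold.
import numpy as np
from scipy.stats import norm

class ADFThompsonBandit:
    def __init__(self, dim, switch_after=5000):
        self.m = np.zeros(dim)            # posterior means of the weights
        self.v = np.ones(dim)             # posterior variances (diagonal Gaussian)
        self.n_obs = 0
        self.switch_after = switch_after  # illustrative EP -> ADF switch point

    def choose(self, contexts):
        """Thompson sampling: sample weights, pick the arm with the highest score."""
        w = self.m + np.sqrt(self.v) * np.random.randn(self.m.size)
        return int(np.argmax(contexts @ w))

    def update(self, x, click):
        """ADF moment-matching update for one observation (probit likelihood)."""
        y = 1.0 if click else -1.0
        s2 = 1.0 + np.sum(self.v * x**2)          # predictive variance
        s = np.sqrt(s2)
        z = y * np.dot(self.m, x) / s
        lam = norm.pdf(z) / norm.cdf(z)           # truncated-Gaussian correction
        self.m += y * lam * (self.v * x) / s
        self.v *= 1.0 - (self.v * x**2 / s2) * lam * (lam + z)
        self.n_obs += 1
        # In FAB-COST, EP would be used while n_obs is small and the cheaper
        # ADF-style update (above) once n_obs exceeds the switch point.

# Toy usage with three arms and 5-dimensional contexts
bandit = ADFThompsonBandit(dim=5)
rng = np.random.default_rng(0)
for _ in range(100):
    contexts = rng.normal(size=(3, 5))            # one context vector per arm
    arm = bandit.choose(contexts)
    click = rng.random() < 0.3                    # simulated user feedback
    bandit.update(contexts[arm], click)
```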

Keywords: cold-start learning, expectation propagation, multi-armed bandits, Thompson Sampling, variational inference

Procedia PDF Downloads 89
7980 Objective Evaluation on Medical Image Compression Using Wavelet Transformation

Authors: Amhimmid Mohammed Saffour, Mustafa Mohamed Abdullah

Abstract:

The use of computers for handling image data in healthcare is growing. However, the amount of data produced by modern image generating techniques is vast. This data might be a problem from a storage point of view or when the data is sent over a network. This paper uses the wavelet transform technique for medical image compression. A MATLAB program is designed to evaluate the medical image storage and transmission time problem at Sebha Medical Center, Libya. In this paper, three different Computed Tomography images (abdomen, brain, and chest) have been selected and compressed using the wavelet transform. Objective evaluation has been performed to measure the quality of the compressed images. The results show that the Peak Signal to Noise Ratio (PSNR), which indicates the quality of the compressed image, ranges from 25.89 dB to 34.35 dB for abdomen images, 23.26 dB to 33.3 dB for brain images, and 25.5 dB to 36.11 dB for chest images. These values show that a compression ratio of nearly 30:1 is acceptable.
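
A minimal sketch of this kind of objective evaluation is given below: the wavelet coefficients of an image are hard-thresholded so that only a small fraction are kept, the image is reconstructed, and the PSNR and an approximate compression ratio are reported. The wavelet, decomposition level, and keep-ratio are assumptions, and a synthetic image stands in for a CT slice.

```python
# Illustrative sketch of the evaluation described above (not the authors' MATLAB
# code): compress an image by zeroing small wavelet coefficients, reconstruct,
# and report PSNR and an approximate compression ratio.
import numpy as np
import pywt

def compress_and_evaluate(image, wavelet="bior4.4", level=3, keep=0.03):
    coeffs = pywt.wavedec2(image.astype(float), wavelet, level=level)
    arr, slices = pywt.coeffs_to_array(coeffs)
    # Keep only the largest `keep` fraction of coefficients (hard threshold).
    thresh = np.quantile(np.abs(arr), 1.0 - keep)
    arr_c = np.where(np.abs(arr) >= thresh, arr, 0.0)
    rec = pywt.waverec2(pywt.array_to_coeffs(arr_c, slices, output_format="wavedec2"),
                        wavelet)[:image.shape[0], :image.shape[1]]
    mse = np.mean((image.astype(float) - rec) ** 2)
    psnr = 10 * np.log10(255.0 ** 2 / mse)        # assumes an 8-bit pixel range
    ratio = arr.size / max(np.count_nonzero(arr_c), 1)
    return rec, psnr, ratio

# Example with a synthetic 8-bit image standing in for a CT slice
img = (np.random.default_rng(1).random((256, 256)) * 255).astype(np.uint8)
_, psnr, ratio = compress_and_evaluate(img)
print(f"PSNR = {psnr:.2f} dB, approx. compression ratio = {ratio:.1f}:1")
```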

Keywords: medical image, Matlab, image compression, wavelets, objective evaluation

Procedia PDF Downloads 268
7979 Fault Detection of Pipeline in Water Distribution Network System

Authors: Shin Je Lee, Go Bong Choi, Jeong Cheol Seo, Jong Min Lee, Gibaek Lee

Abstract:

Water pipe networks are installed underground, and once in place it is difficult to recognize the state of the pipes when a leak or burst happens. Accordingly, post-fault management is often delayed after the fault occurs. Therefore, a systematic fault management system for the water pipe network is required to prevent accidents and minimize losses. In this work, we develop an online fault detection system for a water pipe network using pipe data such as flow rate or pressure. The transient model describing water flow in pipelines is presented and simulated using Matlab. Fault situations such as a leak or burst can also be simulated, and the flow rate or pressure data at the time of the fault are collected. Faults are detected using the statistical methods of the fast Fourier transform and the discrete wavelet transform, and the two are compared to find which method shows the better fault detection performance.
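
As a hedged sketch of the detection step, the snippet below compares FFT and DWT features of a measured pressure signal against a fault-free baseline and flags a fault when the deviation exceeds a threshold; the signals, wavelet, and threshold are illustrative stand-ins rather than values from the paper's transient model.

```python
# Hedged sketch of FFT/DWT-based fault flagging on a pressure signal.
import numpy as np
import pywt

def fft_feature(signal):
    """Magnitude spectrum (DC removed) of the pressure/flow signal."""
    return np.abs(np.fft.rfft(signal - np.mean(signal)))

def dwt_feature(signal, wavelet="db4", level=4):
    """Energy of the detail coefficients at each decomposition level."""
    coeffs = pywt.wavedec(signal, wavelet, level=level)
    return np.array([np.sum(d ** 2) for d in coeffs[1:]])

def detect_fault(baseline, measured, rel_threshold=0.2):
    """Flag a fault if either feature deviates from the baseline by > 20%."""
    for feat in (fft_feature, dwt_feature):
        b, m = feat(baseline), feat(measured)
        if np.linalg.norm(m - b) > rel_threshold * np.linalg.norm(b):
            return True
    return False

# Toy example: a leak modelled as a sudden pressure drop at t = 5 s
t = np.linspace(0, 10, 2000)
baseline = 5.0 + 0.05 * np.sin(2 * np.pi * 1.5 * t)
leak = baseline - 0.8 * (t > 5.0)
print(detect_fault(baseline, leak))   # expected: True
```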

Keywords: fault detection, water pipeline model, fast Fourier transform, discrete wavelet transform

Procedia PDF Downloads 482
7978 Novel Inference Algorithm for Gaussian Process Classification Model with Multiclass and Its Application to Human Action Classification

Authors: Wanhyun Cho, Soonja Kang, Sangkyoon Kim, Soonyoung Park

Abstract:

In this paper, we propose a novel inference algorithm for the multi-class Gaussian process classification model that can be used in the field of human behavior recognition. This algorithm can derive simultaneously both a posterior distribution of the latent function and estimators of the hyper-parameters in a multi-class Gaussian process classification model. Our algorithm is based on the Laplace approximation (LA) technique and the variational EM framework. This is performed in two steps, called the expectation and maximization steps. First, in the expectation step, using the Bayesian formula and the LA technique, we derive an approximate posterior distribution of the latent function indicating the probability that each observation belongs to a certain class in the Gaussian process classification model. Second, in the maximization step, using the derived posterior distribution of the latent function, we compute the maximum likelihood estimator for the hyper-parameters of the covariance matrix necessary to define the prior distribution of the latent function. These two steps are repeated iteratively until a convergence condition is satisfied. Moreover, we apply the proposed algorithm to the human action classification problem using a public database, namely the KTH human action data set. Experimental results reveal that the proposed algorithm shows good performance on this data set.
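
For concreteness, the sketch below shows the Laplace-approximation step for the simpler binary case: a Newton iteration that finds the mode of the latent-function posterior under a logistic likelihood and an RBF-kernel GP prior, following the standard formulation in Rasmussen and Williams. The paper's multi-class treatment and its variational-EM hyper-parameter updates are not reproduced; the kernel settings here are fixed assumptions.

```python
# Minimal sketch of the Laplace-approximation (mode-finding) step for binary GP
# classification; the paper's multi-class algorithm generalizes this.
import numpy as np

def rbf_kernel(X, lengthscale=1.0, variance=1.0):
    d2 = np.sum(X**2, 1)[:, None] + np.sum(X**2, 1)[None, :] - 2 * X @ X.T
    return variance * np.exp(-0.5 * d2 / lengthscale**2)

def laplace_mode(K, y, n_iter=20):
    """Find the posterior mode f_hat for labels y in {-1, +1}."""
    n = len(y)
    f = np.zeros(n)
    for _ in range(n_iter):
        pi = 1.0 / (1.0 + np.exp(-f))             # sigmoid(f)
        t = (y + 1) / 2                           # labels mapped to {0, 1}
        W = pi * (1 - pi)                         # negative Hessian of log-likelihood
        sqrtW = np.sqrt(W)
        B = np.eye(n) + sqrtW[:, None] * K * sqrtW[None, :]
        L = np.linalg.cholesky(B)
        b = W * f + (t - pi)
        rhs = sqrtW * (K @ b)
        a = b - sqrtW * np.linalg.solve(L.T, np.linalg.solve(L, rhs))
        f = K @ a                                 # Newton update (GPML Alg. 3.1)
    return f

# Toy usage: two 1-D clusters with labels -1 / +1
rng = np.random.default_rng(0)
X = np.concatenate([rng.normal(-2, 0.5, 20), rng.normal(2, 0.5, 20)])[:, None]
y = np.concatenate([-np.ones(20), np.ones(20)])
f_hat = laplace_mode(rbf_kernel(X), y)
print(np.sign(f_hat[:3]), np.sign(f_hat[-3:]))    # should separate the clusters
```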

Keywords: bayesian rule, gaussian process classification model with multiclass, gaussian process prior, human action classification, laplace approximation, variational EM algorithm

Procedia PDF Downloads 309
7977 Off-Grid Sparse Inverse Synthetic Aperture Imaging by Basis Shift Algorithm

Authors: Mengjun Yang, Zhulin Zong, Jie Gao

Abstract:

In this paper, a new and robust algorithm is proposed to achieve high resolution for inverse synthetic aperture radar (ISAR) imaging in the compressive sensing (CS) framework. Traditional CS-based methods have to assume that the unknown scatterers lie exactly on the pre-divided grids; otherwise, their reconstruction performance drops significantly. In this processing algorithm, several basis shifts are utilized to achieve the same effect as grid refinement does. The detailed implementation of the basis shift algorithm is presented in this paper. The simulations show that imaging precision can be improved by using the basis shift algorithm. The effectiveness and feasibility of the proposed method are investigated through the simulation results.

Keywords: ISAR imaging, sparse reconstruction, off-grid, basis shift

Procedia PDF Downloads 241
7976 An Improved Sub-Nyquist Sampling Jamming Method for Deceiving Inverse Synthetic Aperture Radar

Authors: Yanli Qi, Ning Lv, Jing Li

Abstract:

The Sub-Nyquist sampling jamming method (SNSJ) is a well-known deception jamming method for inverse synthetic aperture radar (ISAR). However, the SNSJ method is relatively easy to counter, since the amplitudes of the false-target images are weaker than that of the real-target image, the false-target images always lag behind the real-target image, and all targets are located in the same cross-range. In order to overcome the drawbacks mentioned above, a simple modulation based on SNSJ (M-SNSJ) is presented in this paper. The method first uses an amplitude modulation factor to make the amplitude of the false-target images consistent with the real-target image, then uses a down-range modulation factor and a cross-range modulation factor to make the false-target images move freely in down-range and cross-range, respectively; thus the capacity for deception is improved. Finally, simulation results for the six available combinations of the three modulation factors are given to illustrate our conclusion.

Keywords: inverse synthetic aperture radar (ISAR), deceptive jamming, Sub-Nyquist sampling jamming method (SNSJ), modulation based on Sub-Nyquist sampling jamming method (M-SNSJ)

Procedia PDF Downloads 188
7975 Reductive Control in the Management of Redundant Actuation

Authors: Mkhinini Maher, Knani Jilani

Abstract:

We present in this work the performance of a mobile omnidirectional robot through evaluating its management of actuation redundancy, and thus arrive at the predictive control implemented. The distribution of the wrench over the robot's actuators, through the Moore-Penrose pseudo-inverse, corresponds to a geometric distribution of efforts. We will show that the load on the vehicle wheels is not equally distributed, depending on the wheel configuration and the robot's movement. Thus, the sliding threshold is not the same for the three wheels of the vehicle. We suggest exploiting the redundancy of actuation to reduce the risk of wheel sliding and thereby improve the accuracy of displacement. This kind of approach has previously been studied for legged robots.
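
To make the pseudo-inverse distribution concrete, here is a minimal sketch in which a planar wrench (Fx, Fy, Mz) is mapped to wheel traction forces with the Moore-Penrose pseudo-inverse and then with a weighted pseudo-inverse that unloads a wheel assumed to be close to its sliding threshold. A four-omniwheel layout is used purely so the Jacobian has a null space and the redistribution is visible; the paper's three-wheel platform and its actual actuation model differ.

```python
# Illustrative sketch (not the paper's controller): distribute a planar wrench
# to wheel efforts with the plain and the weighted Moore-Penrose pseudo-inverse.
import numpy as np

r = 0.2                                             # wheel mounting radius [m]
phi = np.deg2rad([45.0, 135.0, 225.0, 315.0])       # assumed wheel positions
# Each column maps one wheel's traction force to the body wrench (Fx, Fy, Mz).
J = np.vstack([-np.sin(phi), np.cos(phi), np.full(phi.size, r)])

wrench = np.array([10.0, 0.0, 1.0])                 # desired body wrench

# Minimum-norm distribution via the Moore-Penrose pseudo-inverse.
f_plain = np.linalg.pinv(J) @ wrench

# Weighted distribution: penalize wheel 0 (assumed closest to sliding).
W_inv = np.diag([0.2, 1.0, 1.0, 1.0])               # smaller value -> smaller effort
f_weighted = W_inv @ J.T @ np.linalg.solve(J @ W_inv @ J.T, wrench)

print("plain   :", np.round(f_plain, 3))
print("weighted:", np.round(f_weighted, 3))
print("wrench reproduced:",
      np.allclose(J @ f_plain, wrench), np.allclose(J @ f_weighted, wrench))
```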

Keywords: mobile robot, actuation, redundancy, omnidirectional, Moore-Penrose pseudo-inverse, reductive control

Procedia PDF Downloads 484
7974 Contourlet Transform and Local Binary Pattern Based Feature Extraction for Bleeding Detection in Endoscopic Images

Authors: Mekha Mathew, Varun P Gopi

Abstract:

Wireless Capsule Endoscopy (WCE) has become a valuable device in gastrointestinal (GI) tract diagnosis, as it can examine the entire GI tract, especially the small intestine, without invasiveness or sedation. Bleeding in the digestive tract is a symptom of a disease rather than a disease itself. Hence the detection of bleeding is important in diagnosing many diseases. In this paper we propose a novel method for distinguishing bleeding regions from normal regions based on the contourlet transform and the Local Binary Pattern (LBP). Experiments show that this method provides a high accuracy rate of 96.38% in the CIE XYZ colour space for the k-Nearest Neighbour (k-NN) classifier.
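
A hedged sketch of the classification stage is given below: uniform LBP histograms are used as features for a k-NN classifier. The contourlet features and the CIE XYZ conversion from the paper are omitted, and the training patches are synthetic stand-ins (smooth versus rough texture) rather than endoscopic images.

```python
# Hedged sketch of LBP-histogram features fed to a k-NN classifier.
import numpy as np
from scipy.ndimage import gaussian_filter
from skimage.feature import local_binary_pattern
from sklearn.neighbors import KNeighborsClassifier

P, R = 8, 1                                     # LBP neighbours and radius

def lbp_histogram(gray_patch):
    lbp = local_binary_pattern(gray_patch, P, R, method="uniform")
    hist, _ = np.histogram(lbp, bins=P + 2, range=(0, P + 2), density=True)
    return hist

rng = np.random.default_rng(0)
# Synthetic stand-ins for image patches: smooth texture vs. rough texture.
normal = [gaussian_filter(rng.normal(size=(64, 64)), 3) for _ in range(30)]
bleeding = [rng.normal(size=(64, 64)) for _ in range(30)]
X = np.array([lbp_histogram(p) for p in normal + bleeding])
y = np.array([0] * 30 + [1] * 30)

clf = KNeighborsClassifier(n_neighbors=3).fit(X[::2], y[::2])   # train on half
print("hold-out accuracy:", clf.score(X[1::2], y[1::2]))
```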

Keywords: Wireless Capsule Endoscopy, local binary pattern, k-NN classifier, contourlet transform

Procedia PDF Downloads 466
7973 Development of Non-Intrusive Speech Evaluation Measure Using S-Transform and Light-Gbm

Authors: Tusar Kanti Dash, Ganapati Panda

Abstract:

The evaluation of speech quality and intelligibility is critical to the overall effectiveness of Speech Enhancement Algorithms. Several intrusive and non-intrusive measures are employed to calculate these parameters. Non-intrusive evaluation is the most challenging as, very often, the reference clean speech data is not available. In this paper, a novel non-intrusive speech evaluation measure is proposed using audio features derived from the Stockwell transform. These features are used with the Light Gradient Boosting Machine for the effective prediction of speech quality and intelligibility. The proposed model is analyzed using noisy and reverberant speech from four databases, and the results are compared with the standard intrusive evaluation measures. It is observed from the comparative analysis that the proposed model performs better than the standard non-intrusive models.
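
The modelling step can be sketched with the LightGBM scikit-learn interface as follows: a gradient-boosted regressor maps per-utterance features to a quality score. The feature matrix and target below are synthetic stand-ins for the Stockwell-transform statistics and listening scores, so only the workflow is illustrated.

```python
# Minimal sketch of the prediction stage with LightGBM; features and target are
# placeholders, not the Stockwell-transform features used in the paper.
import numpy as np
import lightgbm as lgb

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 24))                      # placeholder per-utterance features
y = X[:, 0] * 0.8 + X[:, 1] * 0.3 + rng.normal(scale=0.1, size=500)  # synthetic score

model = lgb.LGBMRegressor(n_estimators=200, learning_rate=0.05)
model.fit(X[:400], y[:400])
pred = model.predict(X[400:])
rmse = np.sqrt(np.mean((pred - y[400:]) ** 2))
print(f"hold-out RMSE: {rmse:.3f}")
```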

Keywords: non-Intrusive speech evaluation, S-transform, light GBM, speech quality, and intelligibility

Procedia PDF Downloads 234
7972 The Non-Linear Analysis of Brain Response to Visual Stimuli

Authors: H. Namazi, H. T. N. Kuan

Abstract:

Brain activity can be measured by acquiring and analyzing EEG signals from an individual. In fact, the human brain response to external and internal stimuli is mapped in his EEG signals. Over the years, methods such as the Fourier transform, the wavelet transform, and empirical mode decomposition have been used to analyze EEG signals in order to find the effect of stimuli, especially external stimuli. However, each of these methods has some weak points in the analysis of EEG signals. For instance, the Fourier transform and wavelet transform are linear signal analysis methods, which are not well suited to the analysis of EEG signals, since these are nonlinear. In this research we analyze the brain response to visual stimuli by extracting information in the form of various measures from EEG signals using software developed by our research group. The measures used are Jeffrey's measure, the fractal dimension, and the Hurst exponent. The results of these analyses are useful not only for a fundamental understanding of the brain response to visual stimuli but also provide very good recommendations for clinical purposes.
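
One of these measures can be illustrated with a short, hedged sketch: a rescaled-range (R/S) estimate of the Hurst exponent of a signal. Jeffrey's measure and the fractal dimension are not shown, the window sizes are assumptions, and this is not the research group's own software.

```python
# Hedged sketch: rescaled-range (R/S) estimate of the Hurst exponent.
import numpy as np

def hurst_rs(x, window_sizes=(16, 32, 64, 128, 256)):
    x = np.asarray(x, dtype=float)
    rs = []
    for n in window_sizes:
        vals = []
        for start in range(0, len(x) - n + 1, n):
            w = x[start:start + n]
            z = np.cumsum(w - w.mean())             # cumulative deviation
            r = z.max() - z.min()                   # range
            s = w.std()
            if s > 0:
                vals.append(r / s)
        rs.append(np.mean(vals))
    # Slope of log(R/S) against log(n) estimates the Hurst exponent H.
    H, _ = np.polyfit(np.log(window_sizes), np.log(rs), 1)
    return H

rng = np.random.default_rng(0)
white_noise = rng.normal(size=4096)                 # H close to 0.5
random_walk = np.cumsum(white_noise)                # markedly higher H (persistent)
print(hurst_rs(white_noise), hurst_rs(random_walk))
```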

Keywords: visual stimuli, brain response, EEG signal, fractal dimension, hurst exponent, Jeffrey’s measure

Procedia PDF Downloads 531
7971 A Study on the Solutions of the 2-Dimensional and Fourth-Order Partial Differential Equations

Authors: O. Acan, Y. Keskin

Abstract:

In this study, we carry out a comparative study between the reduced differential transform method, the Adomian decomposition method, the variational iteration method, and the homotopy analysis method. These methods are used in many fields of engineering. This is achieved by handling a kind of two-dimensional, fourth-order partial differential equation called the Kuramoto–Sivashinsky equation. Three numerical examples have also been carried out to validate and demonstrate the efficiency of the four methods. Furthermore, it is shown that the reduced differential transform method has an advantage over the other methods. This method is very effective and simple and can be applied to nonlinear problems used in engineering.

Keywords: reduced differential transform method, adomian decomposition method, variational iteration method, homotopy analysis method

Procedia PDF Downloads 409
7970 Noise Detection Algorithm for Skin Disease Image Identification

Authors: Minakshi Mainaji Sonawane, Bharti W. Gawali, Sudhir Mendhekar, Ramesh R. Manza

Abstract:

People's lives and health are severely impacted by skin diseases. A new study proposes an effective method for identifying the different forms of skin diseases. Image denoising is a technique for improving image quality after it has been degraded by noise. The proposed technique is based on the use of the wavelet transform. The wavelet transform is the best method for analyzing the image due to its ability to split the image into sub-bands, which are used to estimate the noise level in the noisy image. According to the experimental results, the proposed method presents the best values of MSE, PSNR, and entropy for the denoised images. Also, by using different types of wavelet transform filters, the proposed approach can obtain the best results of 23.13, 20.08, and 50.7 for the image denoising process.
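
An illustrative sketch of this kind of denoising step is given below: the noise level is estimated from the finest diagonal wavelet sub-band via the median absolute deviation, a universal soft threshold is applied to the detail coefficients, and MSE/PSNR are reported. The wavelet and threshold rule are common defaults rather than the paper's exact settings, and the image is synthetic.

```python
# Illustrative wavelet denoising with a MAD noise estimate and universal threshold.
import numpy as np
import pywt

def wavelet_denoise(noisy, wavelet="db8", level=3):
    coeffs = pywt.wavedec2(noisy, wavelet, level=level)
    # Robust noise estimate from the finest diagonal detail sub-band.
    sigma = np.median(np.abs(coeffs[-1][-1])) / 0.6745
    thresh = sigma * np.sqrt(2 * np.log(noisy.size))      # universal threshold
    denoised_coeffs = [coeffs[0]] + [
        tuple(pywt.threshold(d, thresh, mode="soft") for d in detail)
        for detail in coeffs[1:]
    ]
    return pywt.waverec2(denoised_coeffs, wavelet)[:noisy.shape[0], :noisy.shape[1]]

def psnr_mse(clean, estimate, peak=255.0):
    mse = np.mean((clean - estimate) ** 2)
    return 10 * np.log10(peak ** 2 / mse), mse

# Toy example: a smooth synthetic image corrupted by Gaussian noise
rng = np.random.default_rng(0)
xx, yy = np.meshgrid(np.linspace(0, 1, 256), np.linspace(0, 1, 256))
clean = 127 + 80 * np.sin(4 * np.pi * xx) * np.cos(3 * np.pi * yy)
noisy = clean + rng.normal(scale=20, size=clean.shape)
restored = wavelet_denoise(noisy)
print("noisy    PSNR/MSE:", psnr_mse(clean, noisy))
print("denoised PSNR/MSE:", psnr_mse(clean, restored))
```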

Keywords: MSE, PSNR, entropy, Gaussian filter, DWT

Procedia PDF Downloads 195
7969 The Analysis of Brain Response to Auditory Stimuli through EEG Signals’ Non-Linear Analysis

Authors: H. Namazi, H. T. N. Kuan

Abstract:

Brain activity can be measured by acquiring and analyzing EEG signals from an individual. In fact, the human brain response to external and internal stimuli is mapped in his EEG signals. Over the years, methods such as the Fourier transform, the wavelet transform, and empirical mode decomposition have been used to analyze EEG signals in order to find the effect of stimuli, especially external stimuli. However, each of these methods has some weak points in the analysis of EEG signals. For instance, the Fourier transform and wavelet transform are linear signal analysis methods, which are not well suited to the analysis of EEG signals, since these are nonlinear. In this research we analyze the brain response to auditory stimuli by extracting information in the form of various measures from EEG signals using software developed by our research group. The measures used are Jeffrey's measure, the fractal dimension, and the Hurst exponent. The results of these analyses are useful not only for a fundamental understanding of the brain response to auditory stimuli but also provide very good recommendations for clinical purposes.

Keywords: auditory stimuli, brain response, EEG signal, fractal dimension, hurst exponent, Jeffrey’s measure

Procedia PDF Downloads 514
7968 A Local Invariant Generalized Hough Transform Method for Integrated Circuit Visual Positioning

Authors: Wei Feilong

Abstract:

In this study, a local invariant generalized Hough transform (LI-GHT) method is proposed for integrated circuit (IC) visual positioning. The original generalized Hough transform (GHT) is robust to external noise; however, it is not suitable for the visual positioning of IC chips due to the four-dimensional (4D) parameter space, which leads to substantial storage requirements and high computational complexity. The proposed LI-GHT method can reduce the dimensionality of the parameter space to 2D thanks to the rotational invariance of the local invariant geometric features, and it can estimate the position and rotation angle of IC chips accurately and in real time under the influence of noise and blur. The experimental results show that the proposed LI-GHT method can estimate the position and rotation angle of IC chips with high accuracy and speed. The proposed LI-GHT algorithm was implemented in the IC visual positioning system of radio frequency identification (RFID) packaging equipment.

Keywords: integrated circuit visual positioning, generalized Hough transform, local invariant generalized Hough transform, IC packaging equipment

Procedia PDF Downloads 245
7967 An Efficient Encryption Scheme Using DWT and Arnold Transforms

Authors: Ali Abdrhman M. Ukasha

Abstract:

Data security is needed in data transmission, storage, and communication. The color image is decomposed into red, green, and blue channels. The blue and green channels are compressed using a 3-level discrete wavelet transform. The Arnold transform is used to change the locations of the red channel pixels as an image scrambling process. Then all these channels are encrypted separately using a key image that has the same size as the original and is generated using private keys and modulo operations. XOR and modulo operations are performed between the encrypted channel images in order to change the image pixel values. The extracted contours of the recovered color image can be obtained with an acceptable level of distortion using the Canny edge detector. Experiments have demonstrated that the proposed algorithm can fully encrypt a 2D color image and completely reconstruct it without any distortion. It is shown that the color image can be protected with a higher security level. The presented method is easy to implement in hardware and is suitable for multimedia protection in real-time applications such as wireless networks and mobile phone services.
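
Two of the building blocks named above can be sketched in a few lines: Arnold cat-map scrambling of a square channel and XOR encryption with a key image of the same size. The DWT compression of the other channels, the modulo mixing, and the contour extraction are omitted, and the channel and key below are random stand-ins.

```python
# Hedged sketch: Arnold cat-map scrambling plus XOR with a key image.
import numpy as np

def arnold_scramble(channel, iterations=5):
    """Arnold cat map on a square channel: (x, y) -> (x + y, x + 2y) mod N."""
    n = channel.shape[0]
    assert channel.shape[0] == channel.shape[1], "square images only"
    out = channel.copy()
    for _ in range(iterations):
        x, y = np.meshgrid(np.arange(n), np.arange(n), indexing="ij")
        scrambled = np.empty_like(out)
        scrambled[(x + y) % n, (x + 2 * y) % n] = out[x, y]
        out = scrambled
    return out

def arnold_unscramble(channel, iterations=5):
    """Inverse map: (u, v) -> (2u - v, v - u) mod N."""
    n = channel.shape[0]
    out = channel.copy()
    for _ in range(iterations):
        x, y = np.meshgrid(np.arange(n), np.arange(n), indexing="ij")
        restored = np.empty_like(out)
        restored[(2 * x - y) % n, (y - x) % n] = out[x, y]
        out = restored
    return out

rng = np.random.default_rng(42)
red = rng.integers(0, 256, (128, 128), dtype=np.uint8)     # stand-in red channel
key = rng.integers(0, 256, (128, 128), dtype=np.uint8)     # key image (same size)

cipher = arnold_scramble(red) ^ key                         # scramble, then XOR
recovered = arnold_unscramble(cipher ^ key)                 # undo XOR, unscramble
print("lossless recovery:", np.array_equal(recovered, red))
```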

Keywords: color image, wavelet transform, edge detector, Arnold transform, lossy image encryption

Procedia PDF Downloads 453
7966 The Inverse Problem in Energy Beam Processes Using Discrete Adjoint Optimization

Authors: Aitor Bilbao, Dragos Axinte, John Billingham

Abstract:

The inverse problem in Energy Beam (EB) processes consists of defining the control parameters, in particular the 2D beam path (position and orientation of the beam as a function of time), to arrive at a prescribed solution (freeform surface). This inverse problem is well understood for conventional machining, because the cutting tool geometry is well defined and the material removal is a time-independent process. In contrast, EB machining is achieved through the local interaction of a beam of particular characteristics (e.g. energy distribution), which leads to a surface-dependent removal rate. Furthermore, EB machining is a time-dependent process in which not only does the removal vary with the dwell time, but any acceleration/deceleration of the machine/beam delivery system when performing raster paths will also influence the actual geometry of the surface to be generated. Two different EB processes, Abrasive Waterjet Machining (AWJM) and Pulsed Laser Ablation (PLA), are studied. Even though they are considered independent technologies, both can be described as time-dependent processes. AWJM can be considered a continuous process, and the etched material depends on the feed speed of the jet at each instant during the process. On the other hand, PLA processes are usually described as discrete systems, and the total removed material is calculated by the summation of the different pulses shot during the process. The overlapping of these shots depends on the feed speed and the frequency between two consecutive shots. However, if the feed speed is sufficiently slow compared with the frequency, then consecutive shots are close enough and the behaviour can be similar to a continuous process. Using this approximation, a generic continuous model can be described for both processes. The inverse problem is usually solved for this kind of process by simply controlling the dwell time in proportion to the required depth of milling at each single pixel on the surface using a linear model of the process. However, this approach does not always lead to a good solution, since linear models are only valid when shallow surfaces are etched. The solution of the inverse problem is improved by using a discrete adjoint optimization algorithm. Moreover, the calculation of the Jacobian matrix consumes less computation time than finite difference approaches. The influence of the dynamics of the machine on the actual movement of the jet is also important and should be taken into account. When the parameters of the controller are not known or cannot be changed, a simple approximation is used for the choice of the slope of a step profile. Several experimental tests are performed for both technologies to show the usefulness of this approach.

Keywords: abrasive waterjet machining, energy beam processes, inverse problem, pulsed laser ablation

Procedia PDF Downloads 257
7965 Image Rotation Using an Augmented 2-Step Shear Transform

Authors: Hee-Choul Kwon, Heeyong Kwon

Abstract:

Image rotation is one of the main pre-processing steps for image processing or image pattern recognition. It is implemented with a rotation matrix multiplication, which requires a lot of floating point arithmetic operations and trigonometric calculations, so it takes a long time to execute. Therefore, there has been a need for a high-speed image rotation algorithm without these two major time-consuming operations. However, the rotated image has a drawback, i.e. distortions. We solved the problem using an augmented two-step shear transform. We compare the presented algorithm with conventional rotation on images of various sizes. Experimental results show that the presented algorithm is superior to conventional rotation.

Keywords: high-speed rotation operation, image rotation, transform matrix, image processing, pattern recognition

Procedia PDF Downloads 246
7964 Screening Deformed Red Blood Cells Irradiated by Ionizing Radiations Using Windowed Fourier Transform

Authors: Dahi Ghareab Abdelsalam Ibrahim, R. H. Bakr

Abstract:

Ionizing radiation, such as gamma radiation and X-rays, has many applications in medical diagnosis and cancer treatment. In this paper, we used the windowed Fourier transform to extract the complex image of the deformed red blood cells. The real values of the complex image are used to extract the best fit of the deformed cell boundary. Male albino rats are irradiated by γ-rays from ⁶⁰Co. The male albino rats are anesthetized with ether, and then blood samples are collected from the eye vein in heparinized capillary tubes for studying the radiation-damaging effect in vivo by the proposed windowed Fourier transform. The peripheral blood films are prepared according to the Brown method. The peripheral blood film is photographed using an Automatic Image Contour Analysis system (SAMICA) from ELBEK-Bildanalyse GmbH, Siegen, Germany. The SAMICA system is provided with an electronic camera connected to a computer through a built-in interface card, and the image can be magnified up to 1200 times and displayed by the computer. The images of the peripheral blood films are then analyzed by the windowed Fourier transform method to extract the precise deformation from the best fit. Based on accurate deformation evaluation of the red blood cells, diseases can be diagnosed in their early stages.

Keywords: windowed Fourier transform, red blood cells, phase wrapping, Image processing

Procedia PDF Downloads 58
7963 Investigation of the Morphology of SiO2 Nano-Particles Using Different Synthesis Techniques

Authors: E. Gandomkar, S. Sabbaghi

Abstract:

In this paper, the effects of different synthesis methods on the morphology and size of silica nanostructures, via modified sol-gel and precipitation methods, have been investigated. The resulting products have been characterized by particle size analysis, scanning electron microscopy (SEM), X-ray diffraction (XRD), and Fourier transform infrared (FT-IR) spectroscopy. As a result, the shape of the SiO2 obtained with the sol-gel and precipitation methods was spherical, but with the modified sol-gel method a nanolayer structure was obtained.

Keywords: modified sol-gel, precipitation, nanolayer, Na2SiO3, nanoparticle

Procedia PDF Downloads 267
7962 Nonlinear Adaptive PID Control for a Semi-Batch Reactor Based on an RBF Network

Authors: Magdi. M. Nabi, Ding-Li Yu

Abstract:

Control of a semi-batch polymerization reactor using an adaptive radial basis function (RBF) neural network method is investigated in this paper. A neural network inverse model is used to estimate the valve position of the reactor; this method can identify the controlled system with the RBF neural network identifier. The weights of the adaptive PID controller are adjusted in a timely manner based on the identification of the plant and the self-learning capability of the RBFNN. A PID controller is used in the feedback control to regulate the actual temperature by compensating the neural network inverse model output. Simulation results show that the proposed control has strong adaptability, robustness, and satisfactory control performance, and that control of the nonlinear system is achieved.

Keywords: Chylla-Haase polymerization reactor, RBF neural networks, feed-forward, feedback control

Procedia PDF Downloads 677
7961 Application of the Bionic Wavelet Transform and Psycho-Acoustic Model for Speech Compression

Authors: Chafik Barnoussi, Mourad Talbi, Adnane Cherif

Abstract:

In this paper we propose a new speech compression system based on the application of the Bionic Wavelet Transform (BWT) combined with the psychoacoustic model. This compression system is a modified version of the compression system using MDCT (Modified Discrete Cosine Transform) filter banks of 32 filters each and the psychoacoustic model. The modification consists of replacing the MDCT filter banks by the bionic wavelet coefficients obtained from the application of the BWT to the speech signal to be compressed. These two methods are evaluated and compared with each other by computing the number of bits before and after compression. They are tested on different speech signals, and the obtained simulation results show that the proposed technique outperforms the second technique in terms of compressed file size. In terms of SNR, PSNR, and NRMSE, the output speech signals of the proposed compression system are of acceptable quality. In terms of PESQ and speech signal intelligibility, the proposed speech compression technique produces reconstructed speech signals of good quality.

Keywords: speech compression, bionic wavelet transform, filterbanks, psychoacoustic model

Procedia PDF Downloads 362
7960 Extended Constraint Mask Based One-Bit Transform for Low-Complexity Fast Motion Estimation

Authors: Oğuzhan Urhan

Abstract:

In this paper, an improved motion estimation (ME) approach based on the weighted constrained one-bit transform is proposed for block-based ME employed in video encoders. Binary ME approaches utilize a low bit-depth representation of the original image frames with a Boolean exclusive-OR based, hardware-efficient matching criterion to decrease the computational burden of the ME stage. The weighted constrained one-bit transform (WC-1BT) based approach improves the performance of conventional C-1BT based ME by employing a 2-bit depth constraint mask instead of a 1-bit depth mask. In this work, the range of the constraint mask is further extended to increase the ME performance of the WC-1BT approach. Experiments reveal that the proposed method provides better ME accuracy compared with existing similar ME methods in the literature.

Keywords: fast motion estimation, low-complexity motion estimation, video coding

Procedia PDF Downloads 297
7959 Digital Joint Equivalent Channel Hybrid Precoding for Millimeterwave Massive Multiple Input Multiple Output Systems

Authors: Linyu Wang, Mingjun Zhu, Jianhong Xiang, Hanyu Jiang

Abstract:

To address the problem that the spectral efficiency of hybrid precoding (HP) is too low in current millimeter wave (mmWave) massive multiple input multiple output (MIMO) systems, this paper proposes a digital joint equivalent channel hybrid precoding algorithm based on the introduction of digital encoding matrix iteration. First, the objective function is expanded to obtain the relation equation, and the pseudo-inverse iterative function of the analog encoder is derived by using the pseudo-inverse method, which solves the problem of the greatly increased amount of computation caused by the rank deficiency of the digital encoding matrix and reduces the overall complexity of hybrid precoding. Secondly, the analog coding matrix and the millimeter-wave sparse channel matrix are combined into an equivalent channel, and then the equivalent channel is subjected to Singular Value Decomposition (SVD) to obtain a digital coding matrix, after which the derived pseudo-inverse iterative function is used to iteratively regenerate the analog encoding matrix. The simulation results show that the proposed algorithm improves the system spectral efficiency by 10–20% compared with other algorithms, and the stability is also improved.
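
A hedged sketch of the equivalent-channel step is shown below: a fixed constant-modulus analog precoder is combined with the channel, the SVD of the equivalent channel supplies the digital precoder, and the spectral efficiency of the resulting hybrid precoder is evaluated. The iterative pseudo-inverse refinement of the analog matrix proposed in the paper is not reproduced, and the antenna/RF-chain dimensions are arbitrary.

```python
# Hedged sketch of the "equivalent channel + SVD" step of hybrid precoding.
import numpy as np

rng = np.random.default_rng(0)
Nt, Nr, Nrf, Ns = 64, 16, 4, 2          # tx antennas, rx antennas, RF chains, streams

# Random narrowband channel and a constant-modulus (phase-only) analog precoder.
H = (rng.normal(size=(Nr, Nt)) + 1j * rng.normal(size=(Nr, Nt))) / np.sqrt(2)
F_rf = np.exp(1j * rng.uniform(0, 2 * np.pi, size=(Nt, Nrf))) / np.sqrt(Nt)

# Equivalent channel seen by the digital precoder, and its SVD.
H_eq = H @ F_rf
_, _, Vh = np.linalg.svd(H_eq)
F_bb = Vh.conj().T[:, :Ns]

# Normalize the total transmit power of the hybrid precoder F_rf @ F_bb.
F_bb *= np.sqrt(Ns) / np.linalg.norm(F_rf @ F_bb, "fro")
F = F_rf @ F_bb

# Spectral efficiency at 10 dB SNR (equal power per stream).
snr = 10 ** (10 / 10)
det = np.real(np.linalg.det(np.eye(Nr) + (snr / Ns) * H @ F @ F.conj().T @ H.conj().T))
print("spectral efficiency [bit/s/Hz]:", np.log2(det))
```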

Keywords: mmWave, massive MIMO, hybrid precoding, singular value decomposition, equivalent channel

Procedia PDF Downloads 71
7958 Application of Optimization Techniques in Overcurrent Relay Coordination: A Review

Authors: Syed Auon Raza, Tahir Mahmood, Syed Basit Ali Bukhari

Abstract:

In a power system, a properly coordinated protection scheme is designed to make sure that only the faulty part of the system is isolated when an abnormal operating condition is reached. The complexity of the system, as well as increased user demand and the deregulated environment, forces utilities to improve system reliability by using a properly coordinated protection scheme. This paper presents an overview of overcurrent relay coordination techniques. Different techniques, such as deterministic techniques, meta-heuristic optimization techniques, hybrid optimization techniques, and trial-and-error optimization techniques, have been reviewed in terms of their method of implementation, operation modes, the nature of the distribution system, and finally their advantages as well as their disadvantages.
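
As a small worked example of the quantity being coordinated, the snippet below evaluates the IEC standard inverse operating time of an overcurrent relay as a function of the Plug Setting Multiplier (PSM) and Time Multiplier Setting (TMS), and checks a coordination margin between a primary and a backup relay; the specific settings and the 0.3 s margin are illustrative choices.

```python
# Worked example: IEC standard inverse relay curve and a coordination margin check.
def iec_standard_inverse_time(psm, tms):
    """IEC 60255 standard inverse curve: t = TMS * 0.14 / (PSM**0.02 - 1)."""
    return tms * 0.14 / (psm ** 0.02 - 1.0)

primary = iec_standard_inverse_time(psm=10.0, tms=0.1)   # relay nearest the fault
backup = iec_standard_inverse_time(psm=5.0, tms=0.3)     # upstream backup relay
margin = backup - primary
print(f"primary {primary:.3f} s, backup {backup:.3f} s, margin {margin:.3f} s")
print("coordinated (margin >= 0.3 s):", margin >= 0.3)
```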

Keywords: distribution system, relay coordination, optimization, Plug Setting Multiplier (PSM)

Procedia PDF Downloads 366
7957 Application of Improved Semantic Communication Technology in Remote Sensing Data Transmission

Authors: Tingwei Shu, Dong Zhou, Chengjun Guo

Abstract:

Semantic communication is an emerging form of communication that realizes intelligent communication by extracting the semantic information of data at the source, transmitting it, and recovering the data at the receiving end. It can effectively solve the problem of data transmission in situations of large data volume, low SNR, and restricted bandwidth. With the development of deep learning, semantic communication has matured further and is gradually being applied in the fields of the Internet of Things, Unmanned Aerial Vehicle cluster communication, remote sensing scenarios, etc. We propose an improved semantic communication system for situations where the data volume is huge and spectrum resources are limited during the transmission of remote sensing images. At the transmitting end, we need to extract the semantic information of remote sensing images, but there are some problems. The traditional semantic communication system based on a Convolutional Neural Network cannot take into account both the global semantic information and the local semantic information of the image, which results in less-than-ideal image recovery at the receiving end. Therefore, we adopt an improved Vision-Transformer-based structure as the semantic encoder instead of the mainstream one using a CNN to extract the image semantic features. In this paper, we first perform pre-processing operations on remote sensing images to improve their resolution in order to obtain images with more semantic information. We use the wavelet transform to decompose the image into high-frequency and low-frequency components, perform bilinear interpolation on the high-frequency components and bicubic interpolation on the low-frequency components, and finally perform the inverse wavelet transform to obtain the pre-processed image. We adopt the improved Vision-Transformer structure as the semantic coder to extract and transmit the semantic information of remote sensing images. The Vision-Transformer structure can better handle training on the huge data volume and extract better image semantic features, and it adopts a multi-layer self-attention mechanism to better capture the correlation between semantic features and reduce redundant features. Secondly, to improve the coding efficiency, we reduce the quadratic complexity of the self-attention mechanism itself to linear so as to improve the image data processing speed of the model. We conducted experimental simulations on the RSOD dataset and compared the designed system with a semantic communication system based on a CNN and with image coding methods such as BPG and JPEG to verify that the method can effectively alleviate the problem of excessive data volume and improve the performance of image data communication.
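
A hedged sketch of the pre-processing step described above is given below: the image is decomposed with a 2-D DWT, the low-frequency sub-band is upsampled with bicubic interpolation and the high-frequency sub-bands with bilinear interpolation, and the inverse DWT then yields a higher-resolution image. The wavelet choice and scale factor are assumptions, and the Vision-Transformer encoder is not shown.

```python
# Hedged sketch of wavelet-domain resolution enhancement (pre-processing only).
import numpy as np
import pywt
from scipy.ndimage import zoom

def wavelet_upscale(image, wavelet="haar", factor=2):
    cA, (cH, cV, cD) = pywt.dwt2(image.astype(float), wavelet)
    cA_up = zoom(cA, factor, order=3)                  # bicubic for low frequencies
    details_up = tuple(zoom(c, factor, order=1) for c in (cH, cV, cD))  # bilinear
    return pywt.idwt2((cA_up, details_up), wavelet)

rng = np.random.default_rng(0)
tile = rng.random((128, 128))                          # stand-in remote sensing tile
upscaled = wavelet_upscale(tile)
print(tile.shape, "->", upscaled.shape)                # roughly doubled resolution
```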

Keywords: semantic communication, transformer, wavelet transform, data processing

Procedia PDF Downloads 54
7956 Data Mining in Healthcare for Predictive Analytics

Authors: Ruzanna Muradyan

Abstract:

Medical data mining is a crucial field in contemporary healthcare that offers cutting-edge tactics with enormous potential to transform patient care. This abstract examines how sophisticated data mining techniques could transform the healthcare industry, with a special focus on how they might improve patient outcomes. Healthcare data repositories have dynamically evolved, producing a rich tapestry of different, multi-dimensional information that includes genetic profiles, lifestyle markers, electronic health records, and more. By utilizing data mining techniques inside this vast library, a variety of prospects for precision medicine, predictive analytics, and insight production become visible. Predictive modeling for illness prediction, risk stratification, and therapy efficacy evaluation is an important point of focus. Healthcare providers may use this abundance of data to tailor treatment plans, identify high-risk patient populations, and forecast disease trajectories by applying machine learning algorithms and predictive analytics. Better patient outcomes, more efficient use of resources, and early treatments are made possible by this proactive strategy. Furthermore, data mining techniques act as catalysts to reveal complex relationships between apparently unrelated data pieces, providing enhanced insights into the causes of disease, genetic susceptibilities, and environmental factors. Healthcare practitioners can get practical insights that guide disease prevention, customized patient counseling, and focused therapies by analyzing these associations. The abstract explores the problems and ethical issues that come with using data mining techniques in the healthcare industry. In order to properly use these approaches, it is essential to find a balance between data privacy, security issues, and the interpretability of complex models. Finally, this abstract demonstrates the revolutionary power of modern data mining methodologies in transforming the healthcare sector. Healthcare practitioners and researchers can uncover unique insights, enhance clinical decision-making, and ultimately elevate patient care to unprecedented levels of precision and efficacy by employing cutting-edge methodologies.

Keywords: data mining, healthcare, patient care, predictive analytics, precision medicine, electronic health records, machine learning, predictive modeling, disease prognosis, risk stratification, treatment efficacy, genetic profiles, precision health

Procedia PDF Downloads 34
7955 Principal Component Analysis in Drug-Excipient Interactions

Authors: Farzad Khajavi

Abstract:

Studies of the interaction between active pharmaceutical ingredients (APIs) and excipients are very important in the pre-formulation stage of development of all dosage forms. Analytical techniques such as differential scanning calorimetry (DSC), thermal gravimetry (TG), and Fourier transform infrared spectroscopy (FTIR) are commonly used tools for investigating the compatibility or incompatibility of APIs with excipients. Sometimes the interpretation of data obtained from these techniques is difficult because of severe overlapping of the API spectrum with the excipients in their mixtures. Principal component analysis (PCA), as a powerful factor analytical method, is used in these situations to resolve the data matrices acquired from these analytical techniques. Binary mixtures of the API and the excipients of interest are considered and produced. The FTIR, DSC, or TG peaks of the pure API, the pure excipient, and their mixtures at different mole ratios construct the rows of the data matrix. By applying PCA to the data matrix, the number of principal components (PCs) is determined so that they contain the total variance of the data matrix. The PCs or factors obtained from the score matrix are plotted in two-dimensional space; if the pure API, its mixture with the excipient at a high amount of API, and the 1:1 mixture form one cluster, while the other cluster comprises the pure excipient and its blend with the API at a high amount of excipient, this confirms the existence of compatibility between the API and the excipient of interest. Otherwise, incompatibility is indicated in the mixture of API and excipient.
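
The chemometric workflow can be sketched as follows: rows of the data matrix are spectra (or thermal curves) of the pure API, the pure excipient, and their mixtures, and the PCA scores are then inspected for clustering. The spectra below are synthetic stand-ins generated as linear mixtures plus noise, so only the procedure is illustrated.

```python
# Hedged sketch of PCA on a matrix of (synthetic) spectra of API/excipient mixtures.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
wavenumbers = np.linspace(0, 1, 300)
api = np.exp(-((wavenumbers - 0.3) ** 2) / 0.002)          # synthetic API spectrum
excipient = np.exp(-((wavenumbers - 0.7) ** 2) / 0.002)    # synthetic excipient spectrum

ratios = [1.0, 0.75, 0.5, 0.25, 0.0]                       # API fraction per row
X = np.array([r * api + (1 - r) * excipient + rng.normal(scale=0.01, size=300)
              for r in ratios])

pca = PCA(n_components=2)
scores = pca.fit_transform(X)
for r, (pc1, pc2) in zip(ratios, scores):
    print(f"API fraction {r:.2f}: PC1 = {pc1:+.3f}, PC2 = {pc2:+.3f}")
print("explained variance:", np.round(pca.explained_variance_ratio_, 3))
```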

Keywords: API, compatibility, DSC, TG, interactions

Procedia PDF Downloads 102
7954 Comparative Analysis of Water-Based Alumina Nanoparticles with Water-Based Cupric Nanoparticles Past an Exponentially Accelerated Vertical Radiative Riga Plate with Heat Transfer

Authors: Kanayo Kenneth Asogwa

Abstract:

The influence of the flow of nanoparticles in nanofluids across a vertical surface is significant, and its applications in the medical sciences, engineering, and the pharmaceutical and food industries are enormous and widely published. However, the comparative examination of alumina nanoparticles and cupric nanoparticles past a rapidly accelerating Riga plate remains unexplored. Thus, this report investigates water-based alumina and cupric nanoparticles flowing past an exponentially accelerated Riga plate. Nanofluids containing copper (II) oxide (CuO) and aluminum oxide (Al2O3) nanoparticles are considered. The Laplace transform technique is used to solve the partial differential equations governing the flow. The effect of various factors on the skin friction coefficient, Nusselt number, and velocity and temperature profiles is investigated and reported in tabular and graphical form. An increase in the modified Hartmann number and the radiative effect enhances the copper (II) oxide nanofluid compared to the aluminum oxide nanofluid, due to the Lorentz force and because CuO is the better heat conductor. At the same time, heat absorption and reactive species favor a slightly larger decline in the alumina nanofluid than in the cupric nanofluid in the thermal and velocity fields. The higher density of the cupric nanofluid is enhanced by increasing the nanoparticle volume fraction over the alumina nanofluid, with a decline in the velocity distribution.
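
Since the solution procedure relies on the Laplace transform, a hedged sketch of a standard numerical inversion route, the Gaver-Stehfest algorithm, is given below and verified on F(s) = 1/(s + 1), whose exact inverse is exp(-t). The paper's own transformed momentum and energy equations are not reproduced here, and an analytical inversion may well have been used instead.

```python
# Hedged sketch: Gaver-Stehfest numerical inversion of a Laplace transform F(s).
import math

def stehfest_coefficients(n):
    """Stehfest weights V_k for an even number of terms n."""
    v = []
    for k in range(1, n + 1):
        s = 0.0
        for j in range((k + 1) // 2, min(k, n // 2) + 1):
            s += (j ** (n // 2) * math.factorial(2 * j)) / (
                math.factorial(n // 2 - j) * math.factorial(j)
                * math.factorial(j - 1) * math.factorial(k - j)
                * math.factorial(2 * j - k))
        v.append((-1) ** (k + n // 2) * s)
    return v

def invert_laplace(F, t, n=12):
    """Approximate f(t) from its Laplace transform F(s) by the Stehfest formula."""
    v = stehfest_coefficients(n)
    ln2_t = math.log(2.0) / t
    return ln2_t * sum(v[k - 1] * F(k * ln2_t) for k in range(1, n + 1))

F = lambda s: 1.0 / (s + 1.0)               # Laplace transform of exp(-t)
for t in (0.5, 1.0, 2.0):
    print(t, invert_laplace(F, t), math.exp(-t))
```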

Keywords: alumina, cupric, nanoparticles, water-based

Procedia PDF Downloads 180
7953 Spatial Interpolation of Intermediate Soil Properties to Enhance Geotechnical Surveying for Foundation Design

Authors: Yelbek B. Utepov, Assel T. Mukhamejanova, Aliya K. Aldungarova, Aida G. Nazarova, Sabit A. Karaulov, Nurgul T. Alibekova, Aigul K. Kozhas, Dias Kazhimkanuly, Akmaral K. Tleubayeva

Abstract:

This research focuses on enhancing geotechnical surveying for foundation design through the spatial interpolation of intermediate soil properties. Traditional geotechnical practices rely on discrete data from borehole drilling, soil sampling, and laboratory analyses, often neglecting the continuous nature of soil properties and disregarding values in intermediate locations. This study challenges these omissions by emphasizing interpolation techniques such as Kriging, Inverse Distance Weighting, and Spline interpolation to capture the nuanced spatial variations in soil properties. The methodology is applied to geotechnical survey data from two construction sites in Astana, Kazakhstan, revealing continuous representations of Young's Modulus, Cohesion, and Friction Angle. The spatial heatmaps generated through interpolation offered valuable insights into the subsurface environment, highlighting heterogeneity and aiding in more informed foundation design decisions for the considered sites. Moreover, intriguing patterns of heterogeneity, as well as visual clusters and transitions between soil classes, were explored within seemingly uniform layers. The study bridges the gap between discrete borehole samples and the continuous subsurface, contributing to the evolution of geotechnical engineering practices. The proposed approach, utilizing open-source geographic information system software, provides a practical tool for visualizing soil characteristics and may pave the way for future advancements in geotechnical surveying and foundation design.
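
One of the interpolation techniques named above can be sketched in a few lines: inverse distance weighting of a soil property from scattered borehole locations onto a grid. The coordinates, modulus values, and power parameter below are synthetic illustrative choices, not data from the Astana sites.

```python
# Minimal sketch of inverse distance weighting (IDW) over synthetic borehole data.
import numpy as np

def idw(xy_known, values, xy_query, power=2.0, eps=1e-12):
    """Inverse distance weighted estimate at each query point."""
    d = np.linalg.norm(xy_query[:, None, :] - xy_known[None, :, :], axis=2)
    w = 1.0 / (d + eps) ** power
    return (w @ values) / w.sum(axis=1)

# Synthetic boreholes: (x, y) in metres and Young's modulus in MPa.
boreholes = np.array([[0.0, 0.0], [50.0, 10.0], [20.0, 40.0], [60.0, 60.0]])
modulus = np.array([18.0, 25.0, 21.0, 30.0])

# Query a small grid spanning the boreholes.
gx, gy = np.meshgrid(np.linspace(0, 60, 4), np.linspace(0, 60, 4))
grid = np.column_stack([gx.ravel(), gy.ravel()])
print(np.round(idw(boreholes, modulus, grid).reshape(4, 4), 1))
```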

Keywords: soil mechanical properties, spatial interpolation, inverse distance weighting, heatmaps

Procedia PDF Downloads 43
7952 Synchrotron X-Ray Based Investigation of As and Fe Bonding Environment in Collard Green Tissue Samples at Different Growth Stages

Authors: Sunil Dehipawala, Aregama Sirisumana, Stephan Smith, P. Schneider, G. Tremberger Jr, D. Lieberman, Todd Holden, T. Cheung

Abstract:

The arsenic and iron environments at different growth stages have been studied with EXAFS and XANES using the Brookhaven Synchrotron Light Source. Collard Greens plants were grown and tissue samples were harvested. The project studied the EXAFS and XANES of tissue samples using the As and Fe K-edges. The Fe absorption and the Fourier transform bond length information were used as a control comparison. The Fourier transform of the XAFS data revealed the coexistence of As (III) and As (V) in the As bonding environment inside the studied plant tissue samples, although the soil only had As (III). The data suggest that Collard Greens has a novel pathway to handle arsenic absorption from soil.

Keywords: EXAFS, fourier transform, metalloproteins, XANES

Procedia PDF Downloads 298