Search results for: feature noise
1287 Quality Assurance in Cardiac Disorder Detection Images
Authors: Anam Naveed, Asma Andleeb, Mehreen Sirshar
Abstract:
In this article, image processing techniques applied to cardiac images for enhancing image quality are surveyed. Two types of methodologies are considered: invasive techniques and non-invasive techniques. Different image processing methods for improving cardiac image quality and reducing the amount of radiation exposure in invasive techniques are explored, and different image processing algorithms for enhancing non-invasive cardiac image quality are described. Besides these two methodologies, a third methodology is applied to the live streaming of the heart rate on an ECG window to extract the necessary information, remove noise, and enhance quality. Sensitivity analyses have been carried out to investigate the impact of cardiac images on the diagnosis of cardiac artery disease and how image enhancement helps the cardiologist diagnose the disease. The paper evaluates the strengths and weaknesses of the different techniques applied to improve image quality and draws a conclusion. Some specific limitations apply to the whole survey: for example, the patient's heart rate must be 70-75 beats/minute during angiography, and there are similar limits on patient weight and the amount of radiation exposure.
Keywords: cardiac images, CT angiography, critical analysis, exposure radiation, invasive techniques, non-invasive techniques
Procedia PDF Downloads 352
1286 A Passive Digital Video Authentication Technique Using Wavelet Based Optical Flow Variation Thresholding
Authors: R. S. Remya, U. S. Sethulekshmi
Abstract:
Detecting the authenticity of a video is an important issue in digital forensics, as video is used as silent evidence in court, for example in child pornography and movie piracy cases, insurance claims, cases involving scientific fraud, traffic monitoring, etc. The biggest threat to video data is the availability of modern open video editing tools, which enable easy editing of videos without leaving any trace of tampering. In this paper, we propose an efficient passive method for detecting inter-frame video tampering, its type, and its location by estimating the optical flow of wavelet features of adjacent frames and thresholding the variation in the estimated feature. The performance of the algorithm is compared with z-score thresholding, and it achieves an efficiency above 95% on all the tested databases. The proposed method works well for videos with dynamic (forensics) as well as static (surveillance) backgrounds.
Keywords: discrete wavelet transform, optical flow, optical flow variation, video tampering
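The variation-thresholding idea can be illustrated with a minimal sketch: given a per-frame-pair feature sequence, flag frames whose variation deviates by more than k standard deviations from the mean (the z-score baseline the authors compare against). The feature values below are hypothetical; the authors' actual wavelet/optical-flow feature is not reproduced here.

```python
from statistics import mean, stdev

def zscore_flag(variations, k=3.0):
    """Flag indices whose feature variation deviates more than
    k standard deviations from the mean (a z-score threshold)."""
    mu, sigma = mean(variations), stdev(variations)
    return [i for i, v in enumerate(variations)
            if sigma > 0 and abs(v - mu) / sigma > k]

# Hypothetical per-frame-pair variation of a wavelet-based flow feature:
variations = [0.11, 0.09, 0.10, 0.12, 0.10, 2.40, 0.11, 0.10]
print(zscore_flag(variations, k=2.0))  # [5] -- the spike marks a candidate splice
```

A real detector would compute the variations from optical flow between consecutive frames; the thresholding step itself is this simple.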
Procedia PDF Downloads 359
1285 Key Competences in Economics and Business Field: The Employers’ Side of the Story
Authors: Bruno Škrinjarić
Abstract:
Rapid technological developments and the increase in organizations’ interdependence on an international scale are changing the traditional workplace paradigm. A key feature of the knowledge-based economy is that employers look for individuals who possess both specific academic skills and knowledge and the capability to be proactive and respond to problems creatively and autonomously. The focus of this paper is workers with an Economics and Business background, and its goals are threefold: (1) to explore a wide range of competences and identify which are most important to employers; (2) to investigate the existence and magnitude of the gap between the required and possessed level of a certain competence; and (3) to inquire how this gap is connected with the performance of a company. A study was conducted on a representative sample of Croatian enterprises during the spring of 2016. Results show that generic, rather than specific, competences are more important to employers, and that the gap between the relative importance of a certain competence and its current representation in the existing workforce is greater for generic competences than for specific ones. Finally, the results do not support the hypothesis that this gap is correlated with firms’ performance.
Keywords: competency gap, competency matching, key competences, firm performance
Procedia PDF Downloads 333
1284 Artificial Intelligence-Based Detection of Individuals Suffering from Vestibular Disorder
Authors: Dua Hişam, Serhat İkizoğlu
Abstract:
Identifying the problem behind a balance disorder is one of the most interesting topics in the medical literature. This study advances the development of artificial intelligence (AI) algorithms by applying multiple machine learning (ML) models to sensory gait data collected from humans in order to distinguish healthy individuals from those suffering from vestibular system (VS) problems. Although AI is widely utilized as a diagnostic tool in medicine, AI models have not previously been used to perform feature extraction and identify VS disorders through training on raw data. In this study, three ML models, the Random Forest classifier (RF), Extreme Gradient Boosting (XGB), and K-Nearest Neighbor (KNN), were trained to detect VS disorder, and the performance of the algorithms was compared using accuracy, recall, precision, and F1-score. With an accuracy of 95.28%, the Random Forest classifier was the most accurate model.
Keywords: vestibular disorder, machine learning, random forest classifier, k-nearest neighbor, extreme gradient boosting
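The four reported metrics can all be computed from a confusion matrix, as in the following sketch (the labels are hypothetical; this is not the authors' pipeline):

```python
def binary_metrics(y_true, y_pred):
    """Accuracy, precision, recall, and F1 for a binary label
    (1 = vestibular disorder, 0 = normal), via the confusion matrix."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    accuracy = (tp + tn) / len(y_true)
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    f1 = 2 * precision * recall / (precision + recall) if precision + recall else 0.0
    return accuracy, precision, recall, f1

# Hypothetical labels for 8 subjects:
y_true = [1, 1, 1, 1, 0, 0, 0, 0]
y_pred = [1, 1, 1, 0, 0, 0, 0, 1]
print(binary_metrics(y_true, y_pred))  # (0.75, 0.75, 0.75, 0.75)
```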
Procedia PDF Downloads 69
1283 Spontaneous Message Detection of Annoying Situation in Community Networks Using Mining Algorithm
Authors: P. Senthil Kumari
Abstract:
The main concern in data mining research is the social control of data mining when handling ambiguity, noise, or incompleteness in text data. We describe an approach for detecting spontaneous text in community networks, achieved by a classification mechanism. In a concrete application domain, the modest privacy settings provided by a community network for avoiding annoying content are applied to the partitioning of user messages. To this end, the mining methodology provides the capability to directly filter messages and thereby improve the quality of the ordering. We adopt learning-centered mining approaches with a pre-processing technique to accomplish this. Our work deals with rule-based personalization for automatic text categorization, which is appropriate in many different frameworks and offers a tolerance value that permits comments to be classified according to a variety of conditions associated with the policy or rule arrangements processed by the learning algorithm. Remarkably, we find that the choice of classifier predicts the class labels used to control inadequate documents on the community network to great effect.
Keywords: text mining, data classification, community network, learning algorithm
Procedia PDF Downloads 508
1282 Management of Femoral Neck Stress Fractures at a Specialist Centre and Predictive Factors to Return to Activity Time: An Audit
Authors: Charlotte K. Lee, Henrique R. N. Aguiar, Ralph Smith, James Baldock, Sam Botchey
Abstract:
Background: Femoral neck stress fractures (FNSF) are uncommon, making up 1 to 7.2% of stress fractures in healthy subjects. FNSFs are prevalent in young women, military recruits, endurance athletes, and individuals with energy deficiency syndrome or the female athlete triad. Presentation is often non-specific, and the condition is often misdiagnosed after the initial examination. There is limited research addressing the return-to-activity time after FNSF. Previous studies have demonstrated prognostic time predictions based on various imaging techniques. Here, (1) OxSport clinic FNSF practice standards are retrospectively reviewed, (2) FNSF cohort demographics are examined, and (3) regression models are used to predict return-to-activity prognosis and consequently determine bone stress risk factors. Methods: Patients with a diagnosis of FNSF attending the OxSport clinic between 01/06/2020 and 01/01/2020 were selected from the Rheumatology Assessment Database Innovation in Oxford (RhADiOn) and the OxSport Stress Fracture Database (n = 14). (1) Clinical practice was audited against five criteria based on local and National Institute for Health and Care Excellence guidance, with a 100% standard. (2) Demographics of the FNSF cohort were examined with Student’s t-test. (3) Lastly, linear regression and random forest regression models were used on this patient cohort to predict return-to-activity time, followed by an analysis of feature importance for each fitted model. Results: OxSport clinical practice met the standard (100%) in 3/5 criteria. The criteria not met were patient waiting times and documentation of all bone stress risk factors. Importantly, analysis of patient demographics showed that, of the population with complete bone stress risk factor assessments, 53% were positive for modifiable bone stress risk factors. Lastly, linear regression analysis was utilized to identify demographic factors that predicted return-to-activity time [R2 = 79.172%; average error 0.226]. This analysis identified four key variables that predicted return-to-activity time: vitamin D level, total hip DEXA T value, femoral neck DEXA T value, and history of an eating disorder/disordered eating. Furthermore, random forest regression models were employed for this task [R2 = 97.805%; average error 0.024]. Analysis of the importance of each feature again identified a set of four variables, three of which matched the linear regression analysis (vitamin D level, total hip DEXA T value, and femoral neck DEXA T value), and the fourth: age. Conclusion: OxSport clinical practice could be improved by evaluating bone stress risk factors more comprehensively. The importance of this evaluation is demonstrated by the proportion of the population found positive for these risk factors. Using this cohort, potential bone stress risk factors that significantly impacted return-to-activity prognosis were predicted using regression models.
Keywords: eating disorder, bone stress risk factor, femoral neck stress fracture, vitamin D
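As a minimal illustration of the linear regression step, ordinary least squares for one predictor can be fitted in closed form. The data below are invented for the example and bear no relation to the study's cohort:

```python
def fit_line(xs, ys):
    """Ordinary least squares for y = a + b*x (closed form)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    return my - b * mx, b  # intercept, slope

# Hypothetical data: vitamin D level (nmol/L) vs. return-to-activity time (weeks)
vit_d = [20, 40, 60, 80]
weeks = [20, 16, 12, 8]
a, b = fit_line(vit_d, weeks)
print(a, b)  # perfectly linear toy data: intercept 24.0, slope -0.2
```

The study's models use several predictors and a random forest; this sketch shows only the basic fitting mechanics.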
Procedia PDF Downloads 183
1281 The Role of Polar Body in the Female Gamete
Authors: Parsa Sheikhzadeh
Abstract:
Polar bodies are cells that form during oogenesis in meiosis; they differentiate and develop from oocytes, and in many animals these cells die following meiotic maturation of the oocyte. During mammalian fertilization, the sperm fuses with the oocyte's membrane, triggering the resumption of meiosis from the metaphase II arrest, the extrusion of the second polar body, and the exocytosis of cortical granules. Origin recognition complex protein 4 (ORC4) forms a cage around the set of chromosomes that will be extruded during polar body formation before it binds to the chromatin shortly before zygotic DNA replication. One unique feature of the female gamete is that the polar bodies can provide beneficial information about the genetic background of the oocyte without potentially destroying it. Testing at the polar body (PB) stage was the least accurate, mainly due to the high incidence of post-zygotic events. On the other hand, results from PB1-MII oocyte pairs validated that PB1 contains nearly the same methylome (average Pearson correlation 0.92) as the sibling MII oocyte. In this article, we comprehensively examine the role of polar bodies in the female human gamete.
Keywords: polar bodies, ORC4, oocyte, genetic, methylome, gamete, female
Procedia PDF Downloads 94
1280 Credit Risk Evaluation Using Genetic Programming
Authors: Ines Gasmi, Salima Smiti, Makram Soui, Khaled Ghedira
Abstract:
Credit risk is considered one of the most important issues for financial institutions, as it can provoke great losses for banks. To this end, numerous methods for credit risk evaluation have been proposed. Many evaluation methods are black-box models that cannot adequately reveal the information hidden in the data, although several works have focused on building transparent rule-based models. For credit risk assessment, the generated rules must be not only highly accurate but also highly interpretable. In this paper, we aim to build an accurate and transparent credit risk evaluation model that proposes a set of classification rules. In fact, we treat credit risk evaluation as an optimization problem addressed with a genetic programming (GP) algorithm, where the goal is to maximize the accuracy of the generated rules. We evaluate the proposed approach on the German and Australian credit datasets. We compared our findings with some existing works; the results show that the proposed GP approach outperforms the other models.
Keywords: credit risk assessment, rule generation, genetic programming, feature selection
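The rule-evolution idea can be sketched with a toy evolutionary search that evolves a single threshold rule to maximize accuracy. This is a greatly simplified stand-in for full genetic programming over rule trees; the `debt_ratio` attribute and the data are hypothetical:

```python
import random

def fitness(threshold, data):
    """Accuracy of the rule 'reject if debt_ratio > threshold' on labeled data."""
    return sum((d > threshold) == bad for d, bad in data) / len(data)

def evolve(data, pop_size=20, generations=30, seed=0):
    """Evolve the threshold: keep the fittest half, refill by mutation."""
    rng = random.Random(seed)
    pop = [rng.uniform(0.0, 1.0) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda t: fitness(t, data), reverse=True)
        elite = pop[: pop_size // 2]
        pop = elite + [min(1.0, max(0.0, t + rng.gauss(0, 0.05))) for t in elite]
    return max(pop, key=lambda t: fitness(t, data))

# Hypothetical applicants: (debt ratio, defaulted?)
data = [(0.2, False), (0.3, False), (0.4, False),
        (0.7, True), (0.8, True), (0.9, True)]
best = evolve(data)
print(best, fitness(best, data))  # any threshold in (0.4, 0.7) would score 1.0
```

Genetic programming proper evolves whole rule expressions rather than a single number, but the fitness-driven loop is the same.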
Procedia PDF Downloads 353
1279 The Implementation of the Javanese Lettered-Manuscript Image Preprocessing Stage Model on the Batak Lettered-Manuscript Image
Authors: Anastasia Rita Widiarti, Agus Harjoko, Marsono, Sri Hartati
Abstract:
This paper presents the results of a study testing whether the preprocessing model for Javanese-character manuscript images, which has been widely applied, can also be applied to segment Batak-character manuscripts. The process begins by converting the input image into a binary image. After the binary image is cleaned of noise, line segmentation using the projection profile is conducted. If an unclear histogram projection is found, a smoothing process is conducted before the line segments are produced. For each line image produced, character segmentation within the line is applied, with regard to the connectivity between the pixels making up the letters, so that no characters are truncated. From prototype testing of the manuscript preprocessing system, the system accuracy on pieces of the Pustaka Batak Podani Ma AjiMamisinon manuscript ranged from 65% to 87.68% with a confidence level of 95%. This accuracy indicates that the preprocessing model for Javanese-character manuscript images can also be applied to images of Batak-character manuscripts.
Keywords: connected component, preprocessing, manuscript image, projection profiles
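The projection-profile line segmentation step can be sketched as follows (a generic illustration on a toy binary image, not the authors' implementation):

```python
def line_segments(binary_image):
    """Horizontal projection profile: sum ink pixels per row, then treat
    maximal runs of non-empty rows as text-line segments (start, end)."""
    profile = [sum(row) for row in binary_image]
    segments, start = [], None
    for i, count in enumerate(profile):
        if count > 0 and start is None:
            start = i
        elif count == 0 and start is not None:
            segments.append((start, i - 1))
            start = None
    if start is not None:
        segments.append((start, len(profile) - 1))
    return segments

# Tiny binary image (1 = ink): two "text lines" separated by a blank row
img = [
    [0, 1, 1, 0],
    [1, 1, 0, 0],
    [0, 0, 0, 0],
    [0, 1, 0, 1],
]
print(line_segments(img))  # [(0, 1), (3, 3)]
```

Character segmentation within each line applies the same idea to the vertical (column) profile, with connectivity checks so that no characters are truncated.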
Procedia PDF Downloads 400
1278 Strategies for the Optimization of Ground Resistance in Large Scale Foundations for Optimum Lightning Protection
Authors: Oibar Martinez, Clara Oliver, Jose Miguel Miranda
Abstract:
In this paper, we discuss the standard improvements which can be made to reduce the earth resistance in difficult terrains for optimum lightning protection, the practical limitations involved, and how modeling can be refined for accurate diagnostics and ground resistance minimization. Ground resistance minimization can be achieved via three different approaches: burying vertical electrodes connected in parallel, burying horizontal conductive plates or meshes, or modifying the terrain itself, either by changing the entire terrain material in a large volume or by adding earth-enhancing compounds. The use of vertical electrodes connected in parallel poses several practical limitations. In order to prevent loss of effectiveness, it is necessary to keep a minimum distance between electrodes, typically around five times the electrode length. Otherwise, the overlapping of the local equipotential lines around each electrode reduces the efficiency of the configuration. The addition of parallel electrodes reduces the resistance and facilitates the measurement, but the basic parallel resistor formula of circuit theory will always underestimate the final resistance. Numerical simulation of the equipotential lines around the electrodes overcomes this limitation. The resistance of a single electrode will always be proportional to the soil resistivity. Electrodes are usually installed with a backfilling material of high conductivity, which increases the effective diameter. However, the improvement is marginal, since the electrode diameter enters the ground resistance estimate only through a logarithmic function. Substances used for efficient chemical treatment must be environmentally friendly and must feature stability, high hygroscopicity, low corrosivity, and high electrical conductivity. A number of earth enhancement materials are commercially available. Many consist of carbon-based materials or clays like bentonite. These materials can also be used as backfilling materials to reduce the resistance of an electrode. Chemical treatment of soil has environmental issues. Some products contain copper sulfate or other copper-based compounds, which may not be environmentally friendly. Carbon-based compounds are relatively inexpensive and do have very low resistivities, but they also have corrosion issues: typically, the carbon can corrode and destroy a copper electrode in around five years. These compounds also raise potential environmental concerns. Some earthing enhancement materials contain cement, which, after installation, acquires properties very close to those of concrete. This prevents the earthing enhancement material from leaching into the soil. After analyzing different configurations, we conclude that a buried conductive ring with vertical electrodes connected periodically should be the optimum baseline solution for the grounding of a large structure installed on high-resistivity terrain. To show this, a practical example is given in which we simulate the ground resistance of a conductive ring buried in a terrain with a resistivity in the range of 1 kOhm·m.
Keywords: grounding improvements, large scale scientific instrument, lightning risk assessment, lightning standards
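The point about parallel electrodes can be sketched numerically, assuming Dwight's standard single-rod approximation R ≈ ρ/(2πL)·(ln(4L/a) − 1). The naive R/n parallel combination shown here is only a lower bound, since overlapping equipotential regions raise the real combined resistance; the rod dimensions are illustrative:

```python
import math

def rod_resistance(rho, length, radius):
    """Single vertical rod earth resistance, Dwight's approximation:
    R = rho / (2*pi*L) * (ln(4*L/a) - 1)  (SI units: ohm*m, m)."""
    return rho / (2 * math.pi * length) * (math.log(4 * length / radius) - 1)

rho = 1000.0       # 1 kOhm*m terrain, as in the example above
L, a = 3.0, 0.008  # illustrative 3 m rod, 16 mm diameter
r_single = rod_resistance(rho, L, a)

n = 10
# Naive circuit-theory parallel combination -- a lower bound only, since
# overlapping equipotential regions raise the real combined resistance:
r_naive_parallel = r_single / n
print(round(r_single, 1), round(r_naive_parallel, 1))
```

Note how a single rod in 1 kOhm·m soil is already in the hundreds of ohms, which is why combined ring-plus-rod configurations and numerical simulation are needed.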
Procedia PDF Downloads 139
1277 Red Blood Cells Deformability: A Chaotic Process
Authors: Ana M. Korol, Bibiana Riquelme, Osvaldo A. Rosso
Abstract:
Since erythrocyte deformability analysis is mostly qualitative, the development of quantitative nonlinear methods is crucial for restricting subjectivity in the study of cell behaviour. An electro-optic mechanical system called an erythrodeformeter has been developed and constructed in our laboratory to evaluate erythrocyte viscoelasticity. A numerical method formulated on the basis of fractal approximation for ordinary Brownian motion (OBM) and fractional Brownian motion (FBM), as well as wavelet transform analysis, is proposed to distinguish chaos from noise, based on the assumption that diffractometric data involve both deterministic and stochastic components and can therefore be modelled as a system of bounded correlated random walks. Here we report studies on 25 donors: 4 alpha-thalassaemic patients, 11 beta-thalassaemic patients, and 10 healthy controls (non-alcoholic, non-smoking individuals). The correlation coefficient, a nonlinear parameter, showed evidence of changes in erythrocyte deformability; the wavelet entropy could quantify the differences detected in the light diffraction patterns. These quantifiers show a good deal of promise for a better understanding of the rheological aspects of erythrocytes and could also help in clinical diagnosis.
Keywords: red blood cells, deformability, nonlinear dynamics, chaos theory, wavelet transform
Procedia PDF Downloads 59
1276 Control of Single Axis Magnetic Levitation System Using Fuzzy Logic Control
Authors: A. M. Benomair, M. O. Tokhi
Abstract:
This paper presents an investigation of a system model for the stabilization of a magnetic levitation system (maglev). The magnetic levitation system is a challenging nonlinear mechatronic system in which an electromagnetic force is required to suspend an object (a metal sphere) in air. The electromagnetic force is very sensitive to noise, which can create acceleration forces on the metal sphere, causing the sphere to move into the unbalanced region. Maglev systems contribute to industry by reducing power consumption, increasing power efficiency, and reducing maintenance costs. Common applications include power generation (e.g., wind turbines), maglev trains, and medical devices (e.g., magnetically suspended artificial heart pumps). This paper presents a comparison of the dynamic response and robustness characteristics of a conventional PD controller and a fuzzy PD controller. The main contribution of this paper is the demonstration of fuzzy PD-type stabilization and robustness, achieved by a method that tunes the scaling factors of the linear PD-type fuzzy controller from an equivalently tuned conventional PD controller.
Keywords: magnetic levitation system, PD controller, Fuzzy Logic Control, Fuzzy PD
Procedia PDF Downloads 273
1275 Multimodal Direct Neural Network Positron Emission Tomography Reconstruction
Authors: William Whiteley, Jens Gregor
Abstract:
In recent developments of direct neural network based positron emission tomography (PET) reconstruction, two prominent architectures have emerged for converting measurement data into images: 1) networks that contain fully-connected layers; and 2) networks that primarily use a convolutional encoder-decoder architecture. In this paper, we present a multi-modal direct PET reconstruction method called MDPET, which is a hybrid approach that combines the advantages of both types of networks. MDPET processes raw data in the form of sinograms and histo-images in concert with attenuation maps to produce high quality multi-slice PET images (e.g., 8x440x440). MDPET is trained on a large whole-body patient data set and evaluated both quantitatively and qualitatively against target images reconstructed with the standard PET reconstruction benchmark of iterative ordered subsets expectation maximization. The results show that MDPET outperforms the best previously published direct neural network methods in measures of bias, signal-to-noise ratio, mean absolute error, and structural similarity.
Keywords: deep learning, image reconstruction, machine learning, neural network, positron emission tomography
Procedia PDF Downloads 111
1274 Rumination in Borderline Personality Disorder: A Meta-Analytic Review
Authors: Mara J. Richman, Zsolt Unoka, Robert Dudas, Zsolt Demetrovics
Abstract:
Borderline personality disorder (BPD) is characterized by deficits in emotion regulation and affective lability. Within this domain, ruminative behaviors have been considered a core feature of emotion dysregulation difficulties. Taking this into consideration, a meta-analysis was performed to assess how BPD symptoms correlate with rumination, while also considering clinical moderator variables such as comorbidity, GAF score, and type of BPD symptom, and demographic moderator variables such as age, gender, and education level. Analysis of correlations across rumination domains for the entire sample revealed a medium overall correlation. When assessing types of rumination, the largest correlation was for pain rumination, followed by anger, depressive, and anxious rumination. Furthermore, affective instability had the strongest correlation with increased rumination, followed by unstable relationships, identity disturbance, and self-harm/impulsivity, respectively. Demographic variables showed no significance. Clinical implications are considered, and further therapeutic interventions are discussed in the context of rumination.
Keywords: borderline personality disorder, meta-analysis, rumination, symptoms
Procedia PDF Downloads 194
1273 Optimized Simultaneous Determination of Theobromine and Caffeine in Fermented and Unfermented Cacao Beans and in Cocoa Products Using Step Gradient Solvent System in Reverse Phase HPLC
Authors: Ian Marc G. Cabugsa, Kim Ryan A. Won
Abstract:
A fast, reliable, and simultaneous HPLC analysis of theobromine and caffeine in cacao and cocoa products was optimized in this study. The samples tested were raw, fermented, and roasted cacao beans as well as commercially available cocoa products. The HPLC analysis was carried out using a step gradient solvent system with acetonitrile and water buffered with H3PO4 as the mobile phase. The HPLC system was optimized at a wavelength of 273 nm and a column temperature of 35 °C with a flow rate of 1.0 mL/min. Using this method, the mean percent recovery, limit of detection (LOD), and limit of quantification (LOQ) for theobromine are 118.68 (±3.38)%, 0.727, and 1.05, respectively. The mean percent recovery, LOD, and LOQ for caffeine are 105.53 (±3.25)%, 2.42, and 3.50, respectively. The inter-day and intra-day precision for theobromine are 4.31% and 4.48%, respectively, while for caffeine they are 7.02% and 7.03%, respectively. Compared to the standard AOAC method using methanol in an isocratic solvent system, the method in this study produced less chromatogram noise around the theobromine and caffeine peaks. The method is readily usable for analyses of cacao and cocoa substances using HPLC with step gradient capability.
Keywords: cacao, caffeine, HPLC, step gradient solvent system, theobromine
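LOD and LOQ are commonly estimated from the calibration curve via the ICH convention LOD = 3.3σ/S and LOQ = 10σ/S, where σ is the standard deviation of the response and S the calibration slope. The abstract does not state which formula the authors used, and the σ and S values below are hypothetical:

```python
def lod_loq(sigma, slope):
    """ICH-style detection and quantification limits from the standard
    deviation of the response (sigma) and the calibration slope (S)."""
    return 3.3 * sigma / slope, 10.0 * sigma / slope

# Hypothetical calibration values (not taken from the paper):
sigma, slope = 0.011, 0.05
lod, loq = lod_loq(sigma, slope)
print(round(lod, 3), round(loq, 3))  # 0.726 2.2
```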
Procedia PDF Downloads 281
1272 Performance Evaluation of MIMO-OFDM Communication Systems
Authors: M. I. Youssef, A. E. Emam, M. Abd Elghany
Abstract:
This paper evaluates the bit error rate (BER) performance of MIMO-OFDM communication systems. A MIMO system uses multiple transmitting and receiving antennas with different coding techniques to enhance either the transmission diversity or the spatial multiplexing gain. In the Alamouti algorithm, the same information is transmitted over multiple antennas at different time intervals and then combined at the receiver to minimize the probability of error, combat fading, and thus improve the received signal-to-noise ratio. In the V-BLAST algorithm, the transmitted signals are divided into different transmit streams and transferred over the channel to be received by different receiving antennas, increasing the transmitted data rate and achieving higher throughput. The paper provides a study of different diversity-gain coding schemes and spatial multiplexing coding for MIMO systems. A comparison of various channel estimation and equalization techniques is given. The simulation is implemented using MATLAB, and the results show the performance of the transmission models under different channel environments.
Keywords: MIMO communication, BER, space codes, channels, Alamouti, V-BLAST
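The Alamouti scheme described above can be sketched for two transmit antennas and one receive antenna; in the noiseless case the linear combining recovers both symbols exactly (the channel coefficients below are arbitrary examples, and the paper's MATLAB simulation adds noise and OFDM on top of this core):

```python
def alamouti_2x1(s1, s2, h1, h2):
    """Alamouti space-time block code over a flat channel with two TX
    antennas and one RX antenna (noiseless sketch).
    Slot 1 sends (s1, s2); slot 2 sends (-conj(s2), conj(s1))."""
    r1 = h1 * s1 + h2 * s2                           # received in slot 1
    r2 = -h1 * s2.conjugate() + h2 * s1.conjugate()  # received in slot 2
    # Linear combining at the receiver:
    g = abs(h1) ** 2 + abs(h2) ** 2
    s1_hat = (h1.conjugate() * r1 + h2 * r2.conjugate()) / g
    s2_hat = (h2.conjugate() * r1 - h1 * r2.conjugate()) / g
    return s1_hat, s2_hat

# QPSK-like symbols and an arbitrary complex channel:
s1, s2 = 1 + 1j, -1 + 1j
h1, h2 = 0.8 - 0.3j, 0.5 + 0.6j
print(alamouti_2x1(s1, s2, h1, h2))  # recovers (s1, s2) up to float rounding
```

The combining gain |h1|² + |h2|² is the two-branch diversity that lets the scheme combat fading.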
Procedia PDF Downloads 175
1271 On-Line Data-Driven Multivariate Statistical Prediction Approach to Production Monitoring
Authors: Hyun-Woo Cho
Abstract:
Detection of incipient abnormal events in production processes is important to improve the safety and reliability of manufacturing operations and reduce losses caused by failures. The construction of calibration models for predicting faulty conditions is essential in making decisions on when to perform preventive maintenance. This paper presents a multivariate calibration monitoring approach based on the statistical analysis of process measurement data. The calibration model is used to predict faulty conditions from historical reference data. The approach utilizes variable selection techniques, and the predictive performance of several prediction methods is evaluated using real data. The results show that the calibration model based on a supervised probabilistic model yielded the best performance in this work. By adopting a proper variable selection scheme in calibration models, the prediction performance can be improved by excluding non-informative variables from the model building steps.
Keywords: calibration model, monitoring, quality improvement, feature selection
Procedia PDF Downloads 356
1270 Numerical Investigations on the Coanda Effect
Authors: Florin Frunzulica, Alexandru Dumitrache, Octavian Preotu
Abstract:
The Coanda effect is the tendency of a jet to remain attached to a sufficiently long or large convex surface. Flows deflected by a curved surface have attracted great interest during the last fifty years; a major interest in the study of this phenomenon comes from the possibility of applying the effect to short take-off and landing aircraft and to thrust vectoring. It is also used in applications involving the mixing of two or more fluids, noise attenuation, ventilation, etc. This paper proposes a numerical study of an aerodynamic configuration that can passively amplify the Coanda effect. On a wing flap with a predetermined configuration, a channel is introduced between two particular zones, a low-pressure one and a high-pressure one. The secondary flow through this channel yields a gap between the jet and the convex surface, keeping the jet attached over a longer distance. Active control of the secondary flow through the channel, based on altering the channel section, controls the attachment of the jet to the surface and automatically controls the deviation angle of the jet. The numerical simulations have been performed in Ansys Fluent for a series of wing flap-channel configurations with varying jet velocity. The numerical results are in good agreement with experimental results.
Keywords: blowing jet, CFD, Coanda effect, circulation control
Procedia PDF Downloads 346
1269 Comparative Study of Soliton Collisions in Uniform and Nonuniform Magnetized Plasma
Authors: Renu Tomar, Hitendra K. Malik, Raj P. Dahiya
Abstract:
Similar to sound waves in air, plasmas support the propagation of ion waves, which evolve into solitary structures when the effects of nonlinearity and dispersion are balanced. Ion acoustic solitary waves have been investigated in detail in homogeneous plasmas, inhomogeneous plasmas, and magnetized plasmas. Ion acoustic solitary waves are also found to reflect from a density gradient or boundary present in the plasma as they propagate. Another interesting feature of solitary waves is their collision. In the present work, we carry out analytical calculations for the head-on collision of solitary waves in a magnetized plasma which contains dust grains in addition to ions and electrons. For this, we employ the Poincaré-Lighthill-Kuo (PLK) method. To lowest nonlinear order, the problem of colliding solitary waves leads to KdV (modified KdV) equations and also yields the phase shifts that occur in the interaction. These calculations are carried out for uniform and nonuniform plasmas, and the results on the soliton properties are discussed in detail.
Keywords: inhomogeneous magnetized plasma, dust charging, soliton collisions, magnetized plasma
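For reference, the lowest-order evolution equation has the standard KdV form with a sech² solitary-wave solution; the coefficients and stretched coordinates depend on the plasma parameters (magnetization, dust charge, density profile) and are not reproduced from the paper:

```latex
% Standard KdV form reached at lowest nonlinear order; A (nonlinearity)
% and B (dispersion) are plasma-parameter-dependent coefficients.
\begin{equation}
  \frac{\partial \phi}{\partial \tau}
  + A\,\phi\,\frac{\partial \phi}{\partial \xi}
  + B\,\frac{\partial^{3} \phi}{\partial \xi^{3}} = 0,
  \qquad
  \phi = \phi_{0}\,\mathrm{sech}^{2}\!\left(\frac{\xi - u\tau}{W}\right)
\end{equation}
```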
Procedia PDF Downloads 470
1268 Improvement of Bone Scintography Image Using Image Texture Analysis
Authors: Yousif Mohamed Y. Abdallah, Eltayeb Wagallah
Abstract:
Image enhancement allows the observer to see details in images that may not be immediately observable in the original image. Image enhancement is the transformation or mapping of one image to another, and enhancing certain features in images can be accompanied by undesirable effects. To achieve maximum image quality after denoising, a new low-order, locally adaptive Gaussian scale mixture model and a median filter are presented, which compensate for nonlinearities caused by scattering. A new nonlinear approach for contrast enhancement of bones in bone scan images, using both gamma correction and negative transform methods, is also presented. The usual assumption of gamma and Poisson statistics leads to overestimation of the noise variance in regions of low intensity and underestimation in regions of high intensity, and therefore to non-optimal results. The contrast enhancement results were obtained and evaluated using MATLAB on nuclear medicine images of the bones. The optimal number of bins, in particular the number of gray levels, is chosen automatically using the entropy and the average distance between the histogram of the original gray-level distribution and the contrast enhancement function's curve.
Keywords: bone scan, nuclear medicine, Matlab, image processing technique
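The two point transforms named above, gamma correction and the negative transform, can be sketched on a row of 8-bit intensities (illustrative values, not the authors' adaptive pipeline):

```python
def gamma_correct(pixels, gamma, max_level=255):
    """Power-law (gamma) transform: s = max * (r/max)**gamma."""
    return [round(max_level * (p / max_level) ** gamma) for p in pixels]

def negative(pixels, max_level=255):
    """Negative transform: s = (L-1) - r."""
    return [max_level - p for p in pixels]

# Hypothetical row of 8-bit bone-scan intensities:
row = [0, 64, 128, 255]
print(gamma_correct(row, 0.5))  # brightens mid-tones: [0, 128, 181, 255]
print(negative(row))            # [255, 191, 127, 0]
```

Gamma < 1 brightens dark regions, which is why it pairs naturally with the negative transform for visualizing low-count bone-scan detail.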
Procedia PDF Downloads 509
1267 Tree Species Classification Using Effective Features of Polarimetric SAR and Hyperspectral Images
Authors: Milad Vahidi, Mahmod R. Sahebi, Mehrnoosh Omati, Reza Mohammadi
Abstract:
Forest management organizations need information to perform their work effectively, and remote sensing is an effective method to acquire information about the Earth. Two remote sensing datasets were used to classify forested regions. First, all extractable features of the hyperspectral and PolSAR images were extracted. The optical features were spectral indices related to chemical and water content, structural indices, effective bands, and absorption features; the PolSAR features were the original data, target decomposition components, and SAR discriminators. Second, particle swarm optimization (PSO) and genetic algorithms (GA) were applied to select the optimal features. A support vector machine (SVM) classifier was then used to classify the image. The results showed that the combination of PSO and SVM achieved the highest overall accuracy of the tested cases, about 90.56%. The effective features were the spectral indices, the bands in the shortwave infrared (SWIR) and visible ranges, and certain PolSAR features.
Keywords: hyperspectral, PolSAR, feature selection, SVM
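The PSO feature-selection step can be sketched generically. In the sketch below, the fitness function is a toy stand-in for the SVM cross-validation accuracy the paper would evaluate; the number of features, the "informative" set, and all PSO constants are illustrative assumptions, not values from the paper:

```python
import math, random

random.seed(0)

N_FEATURES = 10
INFORMATIVE = {0, 2, 5}   # toy ground truth standing in for SVM accuracy

def fitness(mask):
    """Stand-in for classifier accuracy: reward informative features, penalise extras."""
    hits = sum(1 for i in INFORMATIVE if mask[i])
    extras = sum(mask) - hits
    return hits - 0.3 * extras

def binary_pso(n_particles=20, iters=50, w=0.7, c1=1.5, c2=1.5):
    pos = [[random.randint(0, 1) for _ in range(N_FEATURES)] for _ in range(n_particles)]
    vel = [[0.0] * N_FEATURES for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    gbest = max(pbest, key=fitness)[:]
    for _ in range(iters):
        for k in range(n_particles):
            for j in range(N_FEATURES):
                vel[k][j] = (w * vel[k][j]
                             + c1 * random.random() * (pbest[k][j] - pos[k][j])
                             + c2 * random.random() * (gbest[j] - pos[k][j]))
                # sigmoid transfer turns the real-valued velocity into a bit probability
                pos[k][j] = 1 if random.random() < 1 / (1 + math.exp(-vel[k][j])) else 0
            if fitness(pos[k]) > fitness(pbest[k]):
                pbest[k] = pos[k][:]
        gbest = max(pbest + [gbest], key=fitness)[:]
    return gbest

best = binary_pso()
print("selected features:", [i for i, b in enumerate(best) if b])
```

Replacing `fitness` with an SVM cross-validation score over the selected hyperspectral/PolSAR feature columns recovers the wrapper scheme the abstract describes.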
Procedia PDF Downloads 416
1266 Social Media Mining with R. Twitter Analyses
Authors: Diana Codat
Abstract:
Tweet analysis is part of text mining: each document is a written text, so the usual text mining techniques apply, in particular after switching to the bag-of-words representation. But tweets have peculiarities. Some enrich the analysis: their length is calibrated (at least for public messages), special characters identify authors (@) and themes (#), and the tweet and retweet mechanisms make it possible to follow the diffusion of information. Conversely, other characteristics can disrupt the analyses. Because space is limited, authors often use abbreviations and emoticons to express feelings, and they do not pay much attention to spelling. All this creates noise that can complicate the task. Tweets nevertheless carry a lot of potentially interesting information, and their exploitation is one of the main axes of social network analysis. We show how to access Twitter messages, initiate a study of the properties of tweets, and follow up with the exploitation of the content of the messages. We work in R with the package 'twitteR'. The study of tweets is a strong focus of social network analysis because Twitter has become an important vector of communication. This example shows that it is easy to initiate an analysis from data extracted directly online; the data preparation phase is of great importance.
Keywords: data mining, language R, social networks, Twitter
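The paper works in R with 'twitteR'; as a language-neutral illustration of the preparation step it emphasises, the sketch below (Python, regexes and sample tweet are the author's own invention) separates the @ authors and # themes from the noisy text and builds a bag of words:

```python
import re
from collections import Counter

def parse_tweet(text):
    """Split a raw tweet into mentions (@), hashtags (#), and cleaned word tokens."""
    mentions = re.findall(r"@\w+", text)
    hashtags = re.findall(r"#\w+", text)
    # drop mentions, hashtags and URLs before tokenising the remaining words
    cleaned = re.sub(r"(@\w+|#\w+|https?://\S+)", " ", text.lower())
    words = re.findall(r"[a-z]+", cleaned)
    return mentions, hashtags, words

tweet = "RT @alice: Loving the #rstats community! http://example.com :)"
mentions, hashtags, words = parse_tweet(tweet)
bag_of_words = Counter(words)
print(mentions, hashtags, dict(bag_of_words))
```

Emoticons and misspellings survive this pass as noise tokens or are dropped by the `[a-z]+` tokeniser, which is exactly the cleaning trade-off the abstract warns about.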
Procedia PDF Downloads 184
1265 An Inverse Heat Transfer Algorithm for Predicting the Thermal Properties of Tumors during Cryosurgery
Authors: Mohamed Hafid, Marcel Lacroix
Abstract:
This study develops an inverse heat transfer approach for predicting the time-varying freezing front and the temperature distribution of tumors during cryosurgery. Using a temperature probe pressed against the tumor layer, the inverse approach predicts simultaneously the metabolic heat generation and the blood perfusion rate of the tumor. Once these parameters are predicted, the temperature field and the time-varying freezing fronts are determined with the direct model, which rests on the one-dimensional Pennes bioheat equation; the phase change problem is handled with the enthalpy method. The Levenberg-Marquardt Method (LMM) combined with the Broyden Method (BM) is used to solve the inverse model. The effects of (a) the thermal properties of the diseased tissues, (b) the initial guesses for the unknown thermal properties, (c) the data capture frequency, and (d) the noise on the recorded temperatures are examined. It is shown that the proposed inverse approach remains accurate for all the cases investigated.
Keywords: cryosurgery, inverse heat transfer, Levenberg-Marquardt method, thermal properties, Pennes model, enthalpy method
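The Levenberg-Marquardt inner loop can be shown in miniature. In the sketch below, a saturating-exponential response stands in for the Pennes direct model, and its two parameters stand in for the metabolic heat generation and perfusion rate; the model, starting guesses, and data are illustrative, not the paper's:

```python
import math

def model(t, a, b):
    """Toy surrogate for the direct bioheat model: response to two unknown parameters."""
    return a * (1.0 - math.exp(-b * t))

def levenberg_marquardt(times, data, a, b, lam=1e-3, iters=50):
    for _ in range(iters):
        # residuals and analytic Jacobian of the toy model
        r = [y - model(t, a, b) for t, y in zip(times, data)]
        Ja = [1.0 - math.exp(-b * t) for t in times]       # d(model)/da
        Jb = [a * t * math.exp(-b * t) for t in times]     # d(model)/db
        # damped normal equations (J^T J + lam*I) delta = J^T r, solved as a 2x2 system
        s_aa = sum(x * x for x in Ja); s_ab = sum(x * y for x, y in zip(Ja, Jb))
        s_bb = sum(y * y for y in Jb)
        g_a = sum(x * e for x, e in zip(Ja, r)); g_b = sum(y * e for y, e in zip(Jb, r))
        A11, A12, A22 = s_aa + lam, s_ab, s_bb + lam
        det = A11 * A22 - A12 * A12
        da = (A22 * g_a - A12 * g_b) / det
        db = (A11 * g_b - A12 * g_a) / det
        cost = sum(e * e for e in r)
        new_cost = sum((y - model(t, a + da, b + db)) ** 2 for t, y in zip(times, data))
        if new_cost < cost:
            a, b, lam = a + da, b + db, lam / 3   # accept step, trust the model more
        else:
            lam *= 3                              # reject step, damp harder
    return a, b

times = [i * 0.5 for i in range(1, 21)]
data = [model(t, 2.0, 0.8) for t in times]        # synthetic "probe temperatures"
a_est, b_est = levenberg_marquardt(times, data, a=1.0, b=0.5)
print(a_est, b_est)
```

In the paper, the residual would be probe temperature minus direct-model temperature, with the Broyden method providing the Jacobian updates instead of the analytic derivatives used here.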
Procedia PDF Downloads 200
1264 2D and 3D Unsteady Simulation of the Heat Transfer in the Sample during Heat Treatment by Moving Heat Source
Authors: Zdeněk Veselý, Milan Honner, Jiří Mach
Abstract:
The aim of the work is to establish 2D and 3D models of the direct unsteady problem of sample heat treatment by a moving source, employing a computer model based on the finite element method. The complex boundary condition on the heat-loaded sample surface is the essential feature of the task. The computer model describes heat treatment of the sample as the heat source moves over the sample surface. The 2D model of the sample cross-section serves as the basic model, and possibilities for extending it to the 3D task are discussed. The effect of adding the third model dimension on the temperature distribution in the sample is shown, and the influence of various model parameters on the sample temperatures is compared. The effect of heat source motion on the depth of material heat treatment is shown for several velocities of the movement. The presented computer model is intended for use in the laser treatment of machine parts.
Keywords: computer simulation, unsteady model, heat treatment, complex boundary condition, moving heat source
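The abstract's setup (unsteady conduction in a cross-section with a moving surface source) can be illustrated compactly. The paper uses FEM; for brevity, the sketch below uses an explicit finite-difference scheme for the same physics, with a Gaussian source travelling along the top boundary, and every material and source value is illustrative:

```python
import math

# Explicit finite-difference sketch of 2D unsteady heat conduction with a
# Gaussian heat source moving along the top row of the sample cross-section.
NX, NY = 40, 20                      # grid points
DX = 1e-3                            # grid spacing (m), illustrative
ALPHA = 1e-5                         # thermal diffusivity (m^2/s), illustrative
DT = 0.05 * DX * DX / ALPHA          # well within the 2D stability limit dx^2/(4*alpha)
Q = 50.0                             # lumped source strength (K/s at the centre)
V = 4e-3                             # source velocity along x (m/s)

T = [[20.0] * NX for _ in range(NY)] # initial temperature field (deg C)

def step(T, t):
    """One explicit Euler step; the moving source heats the top boundary row."""
    Tn = [row[:] for row in T]
    for i in range(1, NY - 1):
        for j in range(1, NX - 1):
            lap = (T[i+1][j] + T[i-1][j] + T[i][j+1] + T[i][j-1] - 4 * T[i][j]) / DX**2
            Tn[i][j] = T[i][j] + DT * ALPHA * lap
    xc = V * t / DX                  # source centre in grid units
    for j in range(NX):
        Tn[0][j] += Q * DT * math.exp(-((j - xc) ** 2) / 8.0)
    return Tn

t = 0.0
for _ in range(200):
    T = step(T, t)
    t += DT
print("peak surface temperature: %.1f" % max(T[0]))
```

Increasing `V` spreads the same energy over a longer track, which is the mechanism behind the abstract's observation that source velocity controls the depth of heat treatment.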
Procedia PDF Downloads 394
1263 Human Action Recognition Using Wavelets of Derived Beta Distributions
Authors: Neziha Jaouedi, Noureddine Boujnah, Mohamed Salim Bouhlel
Abstract:
In the framework of enhancing human-machine interaction systems, this paper focuses on human behavior analysis and action recognition. Human behavior is characterized by a duality of actions and reactions (movements, psychological changes, verbal and emotional expression). Much information is hidden behind gestures and the trajectories and speeds of sudden motion points, which many research works have treated as an information retrieval problem. In our work we focus on motion extraction, tracking, and action recognition using wavelet network approaches. Our contribution combines foreground extraction by background subtraction with a Gaussian Mixture Model (GMM) and body movement tracking through trajectory models of motion constructed from a Kalman filter. These models remove noise by extracting the main motion features and constitute a stable base for identifying the evolution of human activity. Each modality is used to recognize a human action using a wavelets-of-derived-beta-distributions approach. The proposed approach has been validated successfully on subsets of the KTH and UCF Sports databases.
Keywords: feature extraction, human action classifier, wavelet neural network, beta wavelet
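The Kalman trajectory model in the pipeline can be sketched in one dimension. The sketch below is a constant-velocity filter smoothing a noisy motion-point coordinate; the paper tracks 2D body points in video, and all noise levels and gains here are illustrative assumptions:

```python
import random

random.seed(1)

def kalman_track(measurements, dt=1.0, q=1e-3, r=0.25):
    """Constant-velocity Kalman filter; we observe position only."""
    x, v = measurements[0], 0.0                  # state: position, velocity
    P = [[1.0, 0.0], [0.0, 1.0]]                 # state covariance
    out = []
    for z in measurements:
        # predict: x' = F x, P' = F P F^T + Q with F = [[1, dt], [0, 1]]
        x, v = x + dt * v, v
        P = [[P[0][0] + dt * (P[1][0] + P[0][1]) + dt * dt * P[1][1] + q,
              P[0][1] + dt * P[1][1]],
             [P[1][0] + dt * P[1][1], P[1][1] + q]]
        # update with measurement z: H = [1, 0]
        S = P[0][0] + r
        K0, K1 = P[0][0] / S, P[1][0] / S
        y = z - x
        x, v = x + K0 * y, v + K1 * y
        P = [[(1 - K0) * P[0][0], (1 - K0) * P[0][1]],
             [P[1][0] - K1 * P[0][0], P[1][1] - K1 * P[0][1]]]
        out.append(x)
    return out

true = [0.5 * t for t in range(40)]                     # point moving at 0.5 px/frame
noisy = [p + random.gauss(0.0, 0.5) for p in true]      # jittery tracker output
smooth = kalman_track(noisy)
err_raw = sum((a - b) ** 2 for a, b in zip(noisy, true))
err_kf = sum((a - b) ** 2 for a, b in zip(smooth, true))
print(err_kf < err_raw)
```

The smoothed trajectories (and the velocity state `v`) are the kind of stable motion features the abstract feeds into the wavelet-network classifier.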
Procedia PDF Downloads 411
1262 An Intelligent Traffic Management System Based on the WiFi and Bluetooth Sensing
Authors: Hamed Hossein Afshari, Shahrzad Jalali, Amir Hossein Ghods, Bijan Raahemi
Abstract:
This paper introduces an automated clustering solution that applies to WiFi/Bluetooth sensing data and is later used for traffic management applications. The paper first summarizes a number of clustering approaches and then shows their performance for noise removal. In this context, clustering is used to recognize WiFi and Bluetooth MAC addresses that belong to passengers traveling on a public urban transit bus. The main objective is to build an intelligent system that automatically filters out MAC addresses belonging to persons located outside the bus, for different routes in the city of Ottawa. The proposed intelligent system alleviates the need to define restrictive thresholds, which would otherwise reduce the accuracy as well as the range of applicability of the solution across routes. The paper also discusses the performance benefits of the presented clustering approaches in terms of accuracy, time and space complexity, and ease of use. Note that the clustering results can further be used for origin-destination estimation of individual passengers, traffic load prediction, and intelligent management of urban bus schedules.
Keywords: WiFi-Bluetooth sensing, cluster analysis, artificial intelligence, traffic management
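The abstract does not name the specific clustering algorithm, so the sketch below is only an illustration of the idea: devices on the bus are sighted at many consecutive stops, bystanders at one or two, and a 1-D 2-means split on that feature separates them without a hand-tuned threshold. The MAC addresses and sighting data are invented:

```python
from collections import defaultdict

# Sightings: (mac, stop_index). A passenger's device is detected at many
# consecutive stops; a bystander's device appears at only one or two.
sightings = [
    ("aa:01", 0), ("aa:01", 1), ("aa:01", 2), ("aa:01", 3), ("aa:01", 4),
    ("bb:02", 2), ("bb:02", 3), ("bb:02", 4), ("bb:02", 5), ("bb:02", 6),
    ("cc:03", 3),                       # pedestrian near one stop
    ("dd:04", 5), ("dd:04", 6),         # car briefly alongside the bus
]

def longest_run(stops):
    """Length of the longest run of consecutive stop indices."""
    s = sorted(set(stops))
    best = run = 1
    for a, b in zip(s, s[1:]):
        run = run + 1 if b == a + 1 else 1
        best = max(best, run)
    return best

def split_passengers(sightings):
    """1-D 2-means on the longest-consecutive-run feature; high cluster = on-bus."""
    by_mac = defaultdict(list)
    for mac, stop in sightings:
        by_mac[mac].append(stop)
    runs = {mac: longest_run(stops) for mac, stops in by_mac.items()}
    lo, hi = min(runs.values()), max(runs.values())
    for _ in range(10):                 # k-means iterations on scalar features
        a = [v for v in runs.values() if abs(v - lo) <= abs(v - hi)]
        b = [v for v in runs.values() if abs(v - lo) > abs(v - hi)]
        lo, hi = sum(a) / len(a), sum(b) / len(b)
    return {mac for mac, v in runs.items() if abs(v - hi) < abs(v - lo)}

print(sorted(split_passengers(sightings)))
```

Because the split adapts to each route's data, no fixed cut-off has to be chosen in advance, which is the thresholding problem the abstract says clustering avoids.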
Procedia PDF Downloads 241
1261 User Guidance for Effective Query Interpretation in Natural Language Interfaces to Ontologies
Authors: Aliyu Isah Agaie, Masrah Azrifah Azmi Murad, Nurfadhlina Mohd Sharef, Aida Mustapha
Abstract:
Natural language interfaces typically support a restricted language and have scopes and limitations that naïve users are unaware of, resulting in errors when users attempt to retrieve information from ontologies. To overcome this challenge, an auto-suggest feature is introduced into the querying process, where users are guided through an interactive query construction system. Guiding users to formulate their queries, while providing them with an unconstrained (or almost unconstrained) way to query the ontology, results in better interpretation of the query and ultimately leads to an effective search. The approach described in this paper is unobtrusive and subtly guides the users, so that they have a choice of either selecting from the suggestion list or typing in full. Users are not coerced into accepting system suggestions and can express themselves using fragments or full sentences.
Keywords: auto-suggest, expressiveness, habitability, natural language interface, query interpretation, user guidance
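The core of an unobtrusive auto-suggest feature is prefix lookup over the vocabulary the interface actually supports. A minimal sketch, with an invented vocabulary of ontology terms standing in for whatever the real system extracts:

```python
import bisect

class AutoSuggest:
    """Prefix-based suggestion over a sorted vocabulary of query terms."""
    def __init__(self, terms):
        self.terms = sorted(terms)

    def suggest(self, prefix, limit=5):
        # binary search to the first term >= prefix, then scan while it matches
        lo = bisect.bisect_left(self.terms, prefix)
        out = []
        for term in self.terms[lo:]:
            if not term.startswith(prefix):
                break
            out.append(term)
            if len(out) == limit:
                break
        return out

vocab = ["author", "authored by", "award", "born in", "borough", "works at"]
ac = AutoSuggest(vocab)
print(ac.suggest("auth"))
print(ac.suggest("bo"))
```

Because suggestions come only from terms the interface can interpret, users see the system's habitability boundaries while still being free to ignore the list and type in full.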
Procedia PDF Downloads 474
1260 Performance Evaluation of Vermiculite as Adsorbent Material for Solar-Assisted Air-Conditioning in Tropical Climate
Authors: Norhayati Mat Wajid, Abdul Murad Zainal Abidin, Hasila Jarimi, Kamaruzaman Sopian, Adnan Ibrahim, Ahmad Fazlizan, Afif Safwan
Abstract:
The solar-adsorption air-conditioning system (SADCS) is an alternative to the conventional vapor compression system (VCS). SADCS has several advantages over the VCS: 1) it is a green cooling technology that utilizes solar energy to drive the adsorption/desorption cycle, 2) it can be operated with a green, HFC-free refrigerant (pure water), 3) it is mechanically simpler, and 4) it has a lower operating noise level, since it has no moving parts other than the magnetic valves. Several advancements have been achieved in this field in the last decade, but further research is still needed to bring this technology to a practical level. Hence, this paper presents a literature survey and a review that add insight into the current state of the art of SADCS technologies, with emphasis on practical research conducted at laboratory scale and at the commercial level. The performance of vermiculite as an adsorbent material for SADCS in a tropical climate is discussed in comparison with other adsorbent materials such as silica gel.
Keywords: adsorption cooling, solar-assisted cooling, HVAC, tropical climate, solar thermal
Procedia PDF Downloads 154
1259 Integration of Acoustic Solutions for Classrooms
Authors: Eyibo Ebengeobong Eddie, Halil Zafer Alibaba
Abstract:
Classroom acoustics are neglected in most educational facilities, even though hearing and listening are central to the learning process there. A classroom should therefore be an environment that encourages listening, without obstacles to understanding what is being taught. Although studies have shown that teachers report noise as an everyday source of stress in the classroom, the capacity of individuals to understand speech is further affected by echoes, reverberation, and room modes. It is therefore necessary for classrooms to have good acoustics to aid students' speech intelligibility during the learning process. The influence of these acoustical parameters on learning and teaching in schools needs further research to enhance the teaching and learning capacity of both teacher and student. For this reason, there is a strong need to collect and analyse data to define the acoustic quality classrooms need as learning environments. Research has shown that acoustical problems are still experienced in both newer and older schools. Recently, however, the principles of acoustics have been analysed, and room acoustics can now be measured with various technologies and sound systems to improve and solve the problem of acoustics in classrooms. These acoustic solutions, materials, construction methods, and integration processes are discussed in this paper.
Keywords: classroom, acoustics, materials, integration, speech intelligibility
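The reverberation the abstract names is routinely quantified with Sabine's formula, RT60 = 0.161 V / A, where V is the room volume and A the total absorption (surface area times absorption coefficient, summed). A minimal sketch; the classroom dimensions and coefficients below are illustrative, not measured values:

```python
def sabine_rt60(volume_m3, surfaces):
    """Sabine estimate of reverberation time: RT60 = 0.161 * V / A."""
    total_absorption = sum(area * coeff for area, coeff in surfaces)  # A, in m^2 sabins
    return 0.161 * volume_m3 / total_absorption

# A 7 m x 9 m x 3 m classroom (189 m^3); coefficients are illustrative.
floor, ceiling, walls = 63.0, 63.0, 96.0   # areas in m^2
rt_untreated = sabine_rt60(189.0, [(floor, 0.05), (ceiling, 0.05), (walls, 0.10)])
rt_treated = sabine_rt60(189.0, [(floor, 0.05), (ceiling, 0.60), (walls, 0.10)])
print("untreated: %.2f s, with absorptive ceiling: %.2f s" % (rt_untreated, rt_treated))
```

The comparison shows why integrating a single absorptive surface (here an acoustic ceiling) can bring a reverberant classroom down toward the sub-second reverberation times generally recommended for speech intelligibility.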
Procedia PDF Downloads 417
1258 Carbohydrate Intake Estimation in Type I Diabetic Patients Described by UVA/Padova Model
Authors: David A. Padilla, Rodolfo Villamizar
Abstract:
In recent years, closed-loop control strategies have been developed to establish a healthy glucose profile in type 1 diabetes mellitus (T1DM) patients. However, the controller itself is unable to define a suitable reference trajectory for glucose. In this paper, a control strategy is proposed in which the shape of the reference trajectory is generated based on the amount of carbohydrates present during the digestive process, due to the effect of carbohydrate intake. Since no sensor exists to measure the amount of carbohydrates consumed, an estimator is proposed. This paper thus presents the entire process of designing a carbohydrate estimator, which estimates the disturbance for a model predictive controller (MPC) in a T1DM patient; the estimate is used to establish a reference profile and improve the response of the controller by providing the estimated information on ingested carbohydrates. The dynamics of the diabetic model are given by the equations of the UVA/Padova model from the T1DMS simulator, and the system was developed and simulated in Simulink, taking into account the noise and limitations of the glucose control system actuators.
Keywords: estimation, glucose control, predictive controller, MPC, UVA/Padova
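The paper builds its estimator on the full UVA/Padova model in Simulink; the disturbance-observer idea behind it can be shown on a drastically reduced linear glucose model. Every constant below is illustrative and not a UVA/Padova parameter: the unexplained glucose excursion is attributed to a carbohydrate-appearance disturbance d(t), tracked by an augmented observer state:

```python
DT = 1.0       # sample time (min)
P1 = 0.02      # glucose self-decay rate (1/min), illustrative
L_G, L_D = 0.5, 0.05   # observer gains on the innovation

def simulate_meal(steps, meal_rate, start, dur):
    """'True' plant: linear glucose model driven by a square meal appearance."""
    G, trace = 120.0, []
    for k in range(steps):
        d = meal_rate if start <= k < start + dur else 0.0
        G += DT * (-P1 * (G - 120.0) + d)
        trace.append((G, d))
    return trace

def estimate_disturbance(measurements):
    """Luenberger-style observer with an augmented constant-disturbance state."""
    G_hat, d_hat, est = 120.0, 0.0, []
    for z in measurements:
        G_hat += DT * (-P1 * (G_hat - 120.0) + d_hat)   # predict with current d_hat
        e = z - G_hat                                    # innovation
        G_hat += L_G * e
        d_hat += L_D * e                                 # integrate innovation into d_hat
        est.append(d_hat)
    return est

trace = simulate_meal(steps=180, meal_rate=1.5, start=30, dur=40)
est = estimate_disturbance([g for g, _ in trace])
peak = max(est)
print("estimated peak appearance: %.2f mg/dl/min" % peak)
```

The estimated appearance signal is what would shape the MPC reference profile: it rises during the meal, peaks near the true rate, and decays back to zero afterwards.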
Procedia PDF Downloads 261