Search results for: testing techniques
8283 Validation of Asymptotic Techniques to Predict Bistatic Radar Cross Section
Authors: M. Pienaar, J. W. Odendaal, J. C. Smit, J. Joubert
Abstract:
Simulations are commonly used to predict the bistatic radar cross section (RCS) of military targets since characterization measurements can be expensive and time consuming. It is thus important to accurately predict the bistatic RCS of targets. Computational electromagnetic (CEM) methods can be used for bistatic RCS prediction. CEM methods are divided into full-wave and asymptotic methods. Full-wave methods are numerical approximations to the exact solution of Maxwell’s equations. These methods are very accurate but are computationally very intensive and time consuming. Asymptotic techniques make simplifying assumptions in solving Maxwell's equations and are thus less accurate but require less computational resources and time. Asymptotic techniques can thus be very valuable for the prediction of bistatic RCS of electrically large targets, due to the decreased computational requirements. This study extends previous work by validating the accuracy of asymptotic techniques to predict bistatic RCS through comparison with full-wave simulations as well as measurements. Validation is done with canonical structures as well as complex realistic aircraft models instead of only looking at a complex slicy structure. The slicy structure is a combination of canonical structures, including cylinders, corner reflectors and cubes. Validation is done over large bistatic angles and at different polarizations. Bistatic RCS measurements were conducted in a compact range, at the University of Pretoria, South Africa. The measurements were performed at different polarizations from 2 GHz to 6 GHz. Fixed bistatic angles of β = 30.8°, 45° and 90° were used. The measurements were calibrated with an active calibration target. The EM simulation tool FEKO was used to generate simulated results. The full-wave multi-level fast multipole method (MLFMM) simulated results together with the measured data were used as reference for validation. 
The accuracy of physical optics (PO) and geometrical optics (GO) was investigated. Differences in amplitude, lobing structure, and null positions were observed between the asymptotic, full-wave, and measured data. PO and GO were more accurate at angles close to the specular scattering directions, and their accuracy decreased as the bistatic angle increased. At large bistatic angles, PO did not perform well because the shadow regions are not treated appropriately. PO also performed poorly for canonical structures where multi-bounce was the main scattering mechanism. PO and GO do not account for diffraction, but these inaccuracies tended to decrease as the electrical size of the objects increased. Neither asymptotic technique properly accounts for bistatic structural shadowing. Specular scattering was calculated accurately even when targets did not meet the electrically large criterion. The bistatic RCS prediction performance of PO and GO thus depends on incident angle, frequency, target shape, and observation angle. The improved computational efficiency of the asymptotic solvers is a major advantage over full-wave solvers and measurements; however, there is still much room for improvement in the accuracy of these asymptotic techniques.
Keywords: asymptotic techniques, bistatic RCS, geometrical optics, physical optics
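The computational appeal of PO rests on closed-form specular results. As an illustration of the kind of specular return PO reproduces well, the sketch below evaluates the textbook peak RCS of a flat plate at normal incidence, σ = 4πA²/λ², across the measured 2-6 GHz band; the plate dimensions are hypothetical and not taken from the paper.

```python
import math

def plate_peak_rcs(width_m, height_m, freq_hz):
    """Peak (specular) RCS of a flat plate at normal incidence:
    sigma = 4 * pi * A^2 / lambda^2 (physical optics result)."""
    c = 299_792_458.0              # speed of light, m/s
    lam = c / freq_hz              # wavelength, m
    area = width_m * height_m      # plate area A, m^2
    return 4.0 * math.pi * area ** 2 / lam ** 2

def to_dbsm(sigma_m2):
    """Convert RCS in m^2 to decibels relative to a square meter."""
    return 10.0 * math.log10(sigma_m2)

# A hypothetical 0.3 m x 0.3 m plate across the 2-6 GHz measurement band:
for f in (2e9, 4e9, 6e9):
    print(f"{f / 1e9:.0f} GHz: {to_dbsm(plate_peak_rcs(0.3, 0.3, f)):.1f} dBsm")
```

Note the f² scaling: doubling the frequency quadruples the specular RCS, which is why electrically larger targets favor the asymptotic solvers.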
Procedia PDF Downloads 258
8282 The Automatic Transliteration Model of Images of the Book Hamong Tani Using Statistical Approach
Authors: Agustinus Rudatyo Himamunanto, Anastasia Rita Widiarti
Abstract:
Transliteration of Javanese manuscripts is one of the methods used to preserve the wealth of past literature and pass it on to the present generation in Indonesia. The manual transliteration process commonly requires philologists and takes a relatively long time. The automatic transliteration process is expected to shorten this time and thereby help the work of philologists. The preprocessing and segmentation stage, performed first, prepares the document images, producing script-unit images that are free from noise and similar in thickness, size, and slope. The next stage, feature extraction, finds unique characteristics that distinguish each Javanese script image. One of the characteristics used in this research is the number of black pixels in each image unit. Each Javanese script image in the training data undergoes the same process as the input characters. The system was tested with data from the book Hamong Tani, which was selected for its content, age, and number of pages; these were considered sufficient for an experimental model input. Based on the results of automatic transliteration tests on random pages, the maximum accuracy obtained was 81.53%. This accuracy was obtained with a 32x32-pixel input image size and a 5x5 image window. With regard to these results, it can be concluded that the proposed automatic transliteration model is relatively good.
Keywords: Javanese script, character recognition, statistical, automatic transliteration
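The black-pixel-count feature described above can be sketched as follows; the window size and the toy glyph are illustrative assumptions, not the paper's actual data.

```python
def black_pixel_counts(glyph, window=5):
    """Count black pixels (value 1) in each window x window cell of a
    binary glyph image. The resulting vector of counts serves as a simple
    feature for matching a segmented script unit against training glyphs."""
    h, w = len(glyph), len(glyph[0])
    feats = []
    for r in range(0, h, window):
        for c in range(0, w, window):
            count = sum(glyph[i][j]
                        for i in range(r, min(r + window, h))
                        for j in range(c, min(c + window, w)))
            feats.append(count)
    return feats

# A toy 10x10 "glyph": a filled 4x4 block in the top-left corner.
glyph = [[1 if (i < 4 and j < 4) else 0 for j in range(10)] for i in range(10)]
print(black_pixel_counts(glyph, window=5))  # -> [16, 0, 0, 0]
```

Classification can then compare these count vectors (e.g. by nearest neighbor) against the vectors computed for the training glyphs.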
Procedia PDF Downloads 339
8281 Estimation of Coefficients of Ridge and Principal Components Regressions with Multicollinear Data
Authors: Rajeshwar Singh
Abstract:
The presence of multicollinearity is common when handling several explanatory variables simultaneously, since they may exhibit linear relationships among themselves. This creates a great problem in understanding the impact of the explanatory variables on the dependent variable, and the method of least squares then gives unreliable estimates. In this case, it is advised to detect its presence before proceeding further. Ridge regression reduces the degree of its occurrence, while principal components regression gives good estimates in this situation. This paper discusses the well-known techniques of ridge and principal components regressions and applies both to obtain estimates of the coefficients. In addition, this paper discusses the conflicting claims regarding the discovery of the method of ridge regression, based on available documents.
Keywords: conflicting claim on credit of discovery of ridge regression, multicollinearity, principal components and ridge regressions, variance inflation factor
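For a single centered predictor, the ridge estimator has a simple closed form that makes the shrinkage effect visible. This one-variable sketch illustrates the general estimator β̂ = (XᵀX + kI)⁻¹Xᵀy with hypothetical data; it is a demonstration of the principle, not the paper's computation.

```python
def ridge_beta_1d(x, y, k):
    """Ridge estimate for a single centered predictor:
    beta = sum(x*y) / (sum(x^2) + k).

    k = 0 recovers ordinary least squares; increasing k shrinks the
    coefficient toward zero, trading a little bias for lower variance."""
    sxy = sum(xi * yi for xi, yi in zip(x, y))
    sxx = sum(xi * xi for xi in x)
    return sxy / (sxx + k)

# Hypothetical centered data, roughly y = 2x:
x = [-2.0, -1.0, 0.0, 1.0, 2.0]
y = [-4.1, -1.9, 0.2, 2.1, 3.9]
for k in (0.0, 1.0, 5.0):
    print(f"k={k}: beta={ridge_beta_1d(x, y, k):.3f}")
```

The printed coefficients decrease monotonically with k, which is exactly the shrinkage that stabilizes estimates under multicollinearity.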
Procedia PDF Downloads 420
8280 Longitudinal Static and Dynamic Stability of a Typical Reentry Body in Subsonic Conditions Using Computational Fluid Dynamics
Authors: M. Jathaveda, Joben Leons, G. Vidya
Abstract:
Reentry from orbit is a critical phase of the entry trajectory. For a non-propulsive ballistic entry, static and dynamic stability play an important role in the trajectory, especially for the safe deployment of parachutes, which typically occurs at subsonic Mach numbers. The static stability of flight vehicles is routinely estimated through CFD techniques. Advances in CFD software and computational facilities have enabled the estimation of dynamic stability derivatives through CFD techniques as well. The longitudinal static and dynamic stability of a typical reentry body at a subsonic Mach number of 0.6 is predicted using the commercial software CFD++ and presented here. Steady-state simulations are carried out for α = 2° on an unstructured grid using the SST k-ω model. Transient simulation using the forced oscillation method is used to compute the pitch damping derivatives.
Keywords: stability, typical reentry body, subsonic, static and dynamic
Procedia PDF Downloads 116
8279 Community Engagement: Experience from the SIREN Study in Sub-Saharan Africa
Authors: Arti Singh, Carolyn Jenkins, Oyedunni S. Arulogun, Mayowa O. Owolabi, Fred S. Sarfo, Bruce Ovbiagele, Enzinne Sylvia
Abstract:
Background: Stroke, the leading cause of adult-onset disability and the second leading cause of death, is a major public health concern particularly pertinent in Sub-Saharan Africa (SSA), where nearly 80% of all global stroke mortalities occur. The Stroke Investigative Research and Education Network (SIREN) seeks to comprehensively characterize the genomic, sociocultural, economic, and behavioral risk factors for stroke and to build effective research teams to address and decrease the burden of stroke and other non-communicable diseases in SSA. One of the first steps toward this goal was to effectively engage the communities that suffer the high burden of disease in SSA. This study describes how the SIREN project engaged six sites in Ghana and Nigeria over the past three years, outlining the community engagement activities that have arisen since inception. Aim: The aim of community engagement (CE) within SIREN is to elicit information about knowledge, attitudes, beliefs, and practices (KABP) regarding stroke and its risk factors from individuals of African ancestry in SSA, and to educate the community about stroke and ways to decrease disabilities and deaths from stroke using socioculturally appropriate messaging and messengers. Methods: Community Advisory Boards (CABs), Focus Group Discussions (FGDs), and community outreach programs. Results: 27 FGDs with 168 participants, including community heads, religious leaders, health professionals, and individuals with stroke, among others, were conducted, and over 60 CE outreaches have been conducted within the SIREN performance sites. Over 5,900 individuals have received education on cardiovascular risk factors and about 5,000 have been screened for cardiovascular risk factors during the outreaches. FGDs and outreach programs indicate that knowledge of stroke, as well as of risk factors and follow-up evidence-based care, is limited and often late.
Other findings include: 1) Most recognize hypertension as a major risk factor for stroke. 2) About 50% report that stroke is hereditary, and about 20% do not know which organs are affected by stroke. 3) More than 95% were willing to participate in genetic testing research, and about 85% were willing to pay for testing and recommend the test to others. 4) Almost all indicated that genetic testing could help health providers better treat stroke and help scientists better understand its causes. The CABs provided stakeholder input into SIREN activities and facilitated collaborations among investigators, community members, and stakeholders. Conclusion: The CE core within SIREN is a first-of-its-kind public outreach engagement initiative to evaluate and address perceptions about stroke and genomics among patients, caregivers, and local leaders in SSA, and it has implications as a model for assessment in other high-stroke-risk populations. SIREN’s CE program uses best practices to build capacity for community-engaged research, accelerate the integration of research findings into practice, and strengthen dynamic community-academic partnerships within our communities. CE has had several major successes over the past three years, including our multi-site collaboration examining the KABP about stroke (symptoms, risk factors, burden) and genetic testing across SSA.
Keywords: community advisory board, community engagement, focus groups, outreach, SSA, stroke
Procedia PDF Downloads 428
8278 Robust Features for Impulsive Noisy Speech Recognition Using Relative Spectral Analysis
Authors: Hajer Rahali, Zied Hajaiej, Noureddine Ellouze
Abstract:
The goal of speech parameterization is to extract from the audio signal the relevant information about what is being spoken. In speech recognition systems, Mel-Frequency Cepstral Coefficients (MFCC) and Relative Spectral Mel-Frequency Cepstral Coefficients (RASTA-MFCC) are the two main techniques used. This paper presents some modifications to the original MFCC method. In our work, the effectiveness of the proposed changes to MFCC, called Modified Function Cepstral Coefficients (MODFCC), was tested and compared against the original MFCC and RASTA-MFCC features. Prosodic features such as jitter and shimmer are added to the baseline spectral features. The above-mentioned techniques were tested with impulsive signals under various noisy conditions from the AURORA databases.
Keywords: auditory filter, impulsive noise, MFCC, prosodic features, RASTA filter
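The prosodic features mentioned, jitter and shimmer, have standard local definitions: the mean absolute difference of consecutive pitch periods (or cycle peak amplitudes), normalized by the mean. A minimal sketch with hypothetical period and amplitude values:

```python
def jitter(periods_s):
    """Local jitter: mean absolute difference of consecutive pitch periods,
    normalized by the mean period (often reported as a percentage)."""
    diffs = [abs(a - b) for a, b in zip(periods_s, periods_s[1:])]
    return (sum(diffs) / len(diffs)) / (sum(periods_s) / len(periods_s))

def shimmer(amplitudes):
    """Local shimmer: the same measure applied to cycle peak amplitudes."""
    diffs = [abs(a - b) for a, b in zip(amplitudes, amplitudes[1:])]
    return (sum(diffs) / len(diffs)) / (sum(amplitudes) / len(amplitudes))

# Hypothetical cycle measurements for a ~100 Hz voice:
periods = [0.0100, 0.0102, 0.0099, 0.0101, 0.0100]
amps = [0.80, 0.82, 0.79, 0.81, 0.80]
print(f"jitter  = {jitter(periods):.4f}")
print(f"shimmer = {shimmer(amps):.4f}")
```

In practice the periods and amplitudes come from a pitch tracker; the values above are only to show the arithmetic.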
Procedia PDF Downloads 425
8277 Evaluation of the Performance of Solar Stills as an Alternative for Brine Treatment Applying the Monte Carlo Ray Tracing Method
Authors: B. E. Tarazona-Romero, J. G. Ascanio-Villabona, O. Lengerke-Perez, A. D. Rincon-Quintero, C. L. Sandoval-Rodriguez
Abstract:
Desalination offers solutions for the shortage of water in the world; however, the process of eliminating salts generates a by-product known as brine, generally discharged into the environment through techniques that mitigate its impact. Brine treatment techniques are vital to developing an environmentally sustainable desalination process. Consequently, this document evaluates three different geometric configurations of solar stills as an alternative for brine treatment to be integrated into a low-scale desalination process. The geometric scenarios studied were selected because they have characteristics that match the concept of appropriate technology: low cost, intensive use of labor and material resources for local manufacturing, modularity, and simplicity of construction. Additionally, the conceptual design of the collectors was carried out, and the ray tracing methodology was applied through the open-access software SolTrace and Tonatiuh. The simulation process used 600.00 rays and modified two input parameters: direct normal irradiance (DNI) and reflectance. In summary, for the scenarios evaluated, the ladder-type distiller presented higher efficiency values compared to the pyramid-type and single-slope collectors. Finally, the efficiency of the collectors studied was directly related to their geometry; that is, large geometries allow them to receive a greater number of solar rays along various paths, affecting the efficiency of the device.
Keywords: appropriate technology, brine treatment techniques, desalination, monte carlo ray tracing
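At its simplest, Monte Carlo ray tracing of the kind SolTrace and Tonatiuh perform amounts to firing random rays and counting which ones reach a receiver. The toy sketch below estimates the intercepted fraction for a flat square aperture with a circular receiver; the geometry is a deliberately simplified, hypothetical stand-in for the still collectors studied, not their actual model.

```python
import random

def mc_hit_fraction(n_rays, receiver_radius=0.5, seed=42):
    """Fire n_rays at random points of a 1 m x 1 m aperture and count those
    that land inside a centered circular receiver of the given radius.

    The hit fraction estimates the intercepted power fraction; the analytic
    answer here is pi * r^2 = ~0.785 for r = 0.5 m."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(n_rays):
        x = rng.uniform(-0.5, 0.5)
        y = rng.uniform(-0.5, 0.5)
        if x * x + y * y <= receiver_radius ** 2:
            hits += 1
    return hits / n_rays

print(mc_hit_fraction(100_000))  # close to pi/4 ~ 0.785
```

Production ray tracers add reflection, absorption, and sun-shape sampling on top of exactly this counting scheme, which is why the number of rays and the optical properties (DNI, reflectance) drive the accuracy of the flux estimate.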
Procedia PDF Downloads 71
8276 Numerical Modeling for Water Engineering and Obstacle Theory
Authors: Mounir Adal, Baalal Azeddine, Afifi Moulay Larbi
Abstract:
Numerical analysis is a branch of mathematics devoted to the development of iterative matrix calculation techniques. The objective is to optimize these operations so that systems of equations of order n can be solved with savings in time and energy on computers tasked with analyzing big data through matrix equations. Furthermore, this scientific discipline produces results with a margin of approximation error characterized by convergence rates. Thus, the results obtained from numerical analysis techniques, run in computer software such as MATLAB or Simulink, offer a preliminary diagnosis of the situation of the environment or of space targets. From this, we can derive the technical procedures needed for engineering or scientific studies of water that are exploitable by engineers.
Keywords: numerical analysis methods, obstacles solving, engineering, simulation, numerical modeling, iteration, computer, MATLAB, water, underground, velocity
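A minimal example of the iterative matrix techniques described is the Jacobi method, sketched here for a small diagonally dominant system; the example system is hypothetical and stands in for the much larger systems arising in water-flow models.

```python
def jacobi(A, b, iterations=50):
    """Solve A x = b by Jacobi iteration (A must be diagonally dominant).

    Each sweep updates x_i = (b_i - sum_{j != i} A_ij * x_j) / A_ii,
    trading the cost of a direct solve for cheap, repeatable
    matrix-vector work that parallelizes well."""
    n = len(b)
    x = [0.0] * n
    for _ in range(iterations):
        x = [(b[i] - sum(A[i][j] * x[j] for j in range(n) if j != i)) / A[i][i]
             for i in range(n)]
    return x

# Hypothetical 2x2 diagonally dominant system:
A = [[4.0, 1.0], [2.0, 5.0]]
b = [9.0, 12.0]
print(jacobi(A, b))  # converges toward the exact solution [11/6, 5/3]
```

The "rate" the abstract alludes to is visible here: each sweep multiplies the error by a fixed contraction factor, so accuracy improves geometrically with the number of iterations.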
Procedia PDF Downloads 462
8275 Quantitative Comparisons of Different Approaches for Rotor Identification
Authors: Elizabeth M. Annoni, Elena G. Tolkacheva
Abstract:
Atrial fibrillation (AF) is the most common sustained cardiac arrhythmia that is a known prognostic marker for stroke, heart failure and death. Reentrant mechanisms of rotor formation, which are stable electrical sources of cardiac excitation, are believed to cause AF. No existing commercial mapping systems have been demonstrated to consistently and accurately predict rotor locations outside of the pulmonary veins in patients with persistent AF. There is a clear need for robust spatio-temporal techniques that can consistently identify rotors using unique characteristics of the electrical recordings at the pivot point that can be applied to clinical intracardiac mapping. Recently, we have developed four new signal analysis approaches – Shannon entropy (SE), Kurtosis (Kt), multi-scale frequency (MSF), and multi-scale entropy (MSE) – to identify the pivot points of rotors. These proposed techniques utilize different cardiac signal characteristics (other than local activation) to uncover the intrinsic complexity of the electrical activity in the rotors, which are not taken into account in current mapping methods. We validated these techniques using high-resolution optical mapping experiments in which direct visualization and identification of rotors in ex-vivo Langendorff-perfused hearts were possible. Episodes of ventricular tachycardia (VT) were induced using burst pacing, and two examples of rotors were used showing 3-sec episodes of a single stationary rotor and figure-8 reentry with one rotor being stationary and one meandering. Movies were captured at a rate of 600 frames per second for 3 sec. with 64x64 pixel resolution. These optical mapping movies were used to evaluate the performance and robustness of SE, Kt, MSF and MSE techniques with respect to the following clinical limitations: different time of recordings, different spatial resolution, and the presence of meandering rotors. 
To quantitatively compare the results, the SE, Kt, MSF and MSE techniques were compared to the “true” rotor(s) identified using the phase map. Accuracy was calculated for each approach as the duration of the time series and the spatial resolution were reduced. The time series duration was decreased from its original length of 3 sec down to 2, 1, and 0.5 sec. The spatial resolution of the original VT episodes was decreased from 64x64 pixels to 32x32, 16x16, and 8x8 pixels by uniformly removing pixels from the optical mapping video. Our results demonstrate that Kt, MSF and MSE were able to accurately identify the pivot point of the rotor under all three clinical limitations. The MSE approach demonstrated the best overall performance, but Kt was the best at identifying the pivot point of the meandering rotor. Artifacts mildly affected the performance of the Kt, MSF and MSE techniques, but had a strong negative impact on the performance of SE. The results of our study motivate further validation of the SE, Kt, MSF and MSE techniques using intra-atrial electrograms from paroxysmal and persistent AF patients to see if these approaches can identify pivot points in a clinical setting. More accurate rotor localization could significantly increase the efficacy of catheter ablation to treat AF, resulting in a higher success rate for single procedures.
Keywords: atrial fibrillation, optical mapping, signal processing, rotors
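Two of the four approaches, Shannon entropy (SE) and kurtosis (Kt), can be sketched on a single pixel's time series; the histogram binning and the synthetic signals below are illustrative assumptions, not the study's optical mapping data.

```python
import math, random

def shannon_entropy(signal, bins=8):
    """Histogram-based Shannon entropy (in bits) of one pixel's time series."""
    lo, hi = min(signal), max(signal)
    width = (hi - lo) / bins or 1.0
    counts = [0] * bins
    for v in signal:
        counts[min(int((v - lo) / width), bins - 1)] += 1
    n = len(signal)
    return -sum((c / n) * math.log2(c / n) for c in counts if c)

def kurtosis(signal):
    """Fourth standardized moment; values far from that of a clean periodic
    wave flag the complex activity expected near a rotor pivot."""
    n = len(signal)
    mean = sum(signal) / n
    var = sum((v - mean) ** 2 for v in signal) / n
    m4 = sum((v - mean) ** 4 for v in signal) / n
    return m4 / var ** 2

# Compare a regular (sinusoidal) and an irregular synthetic series:
rng = random.Random(0)
periodic = [math.sin(2 * math.pi * i / 20) for i in range(600)]
irregular = [rng.gauss(0, 1) for _ in range(600)]
print("entropy :", shannon_entropy(periodic), shannon_entropy(irregular))
print("kurtosis:", kurtosis(periodic), kurtosis(irregular))
```

Mapping either statistic over every pixel yields a spatial map whose extremum is a candidate pivot location, which is the form in which these measures were compared against the phase-map "truth".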
Procedia PDF Downloads 324
8274 Parameter Selection and Monitoring for Water-Powered Percussive Drilling in Green-Fields Mineral Exploration
Authors: S. J. Addinell, T. Richard, B. Evans
Abstract:
The Deep Exploration Technologies Cooperative Research Centre (DET CRC) is researching and developing a new coiled-tubing-based greenfields mineral exploration drilling system utilising downhole water-powered percussive drill tooling. This new drilling system is aimed at significantly reducing the costs associated with identifying mineral resource deposits beneath deep, barren cover. The system has shown superior rates of penetration in water-rich hard rock formations at depths exceeding 500 meters. Several challenges exist regarding the deployment and use of these bottom hole assemblies for mineral exploration, and this paper discusses some of the key technical ones. This paper presents experimental results obtained from the research program during laboratory and field testing of the prototype drilling system. A study of the morphological aspects of the cuttings generated during the percussive drilling process is presented and shows a strong power law relationship for particle size distributions. Several percussive drilling parameters such as RPM, applied fluid pressure and weight on bit have been shown to influence the particle size distributions of the cuttings generated. This has a direct influence on other drilling parameters such as flow loop performance, cuttings dewatering, and solids control. Real-time, accurate knowledge of percussive system operating parameters will assist the driller in maximising the efficiency of the drilling process. The applied fluid flow, fluid pressure, and rock properties are known to influence the natural oscillating frequency of the percussive hammer, but this paper also shows that drill bit design, drill bit wear and the applied weight on bit can influence the oscillation frequency as well. Due to the changing drilling conditions and therefore changing operating parameters, real-time understanding of the natural operating frequency is paramount to achieving system optimisation.
Several techniques to determine the oscillating frequency have been investigated and are presented. With a conventional top drive drilling rig, spectral analysis of the applied fluid pressure, hydraulic feed force pressure, hold-back pressure and drill string vibrations has shown the presence of the operating frequency of the bottom hole tooling. However, with a coiled tubing drilling rig, which implements a positive displacement downhole motor to provide drill bit rotation, these signals are not available for interrogation at the surface, and therefore another method must be considered. The investigation and analysis of ground vibrations using geophone sensors, similar to seismic-while-drilling techniques, has indicated the presence of the natural oscillating frequency of the percussive hammer. This method is shown to provide a robust technique for the determination of the downhole percussive oscillation frequency when used with a coiled tubing drill rig.
Keywords: cuttings characterization, drilling optimization, oscillation frequency, percussive drilling, spectral analysis
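The spectral analysis step can be sketched as picking the dominant bin of a discrete Fourier transform of a surface signal; the sampling rate and the synthetic 30 Hz "hammer" signal below are illustrative assumptions, not field data.

```python
import cmath, math

def dominant_frequency(samples, sample_rate_hz):
    """Return the frequency of the bin with the largest DFT magnitude
    (DC excluded). A naive O(n^2) DFT is enough for a short window of
    pressure or geophone samples; production code would use an FFT."""
    n = len(samples)
    best_k, best_mag = 1, 0.0
    for k in range(1, n // 2):
        coeff = sum(samples[t] * cmath.exp(-2j * math.pi * k * t / n)
                    for t in range(n))
        if abs(coeff) > best_mag:
            best_k, best_mag = k, abs(coeff)
    return best_k * sample_rate_hz / n

# Synthetic signal: a 30 Hz hammer oscillation plus a slow pressure drift,
# sampled at 1 kHz for half a second.
fs = 1000.0
sig = [math.sin(2 * math.pi * 30 * t / fs) + 0.3 * (t / 500) for t in range(500)]
print(dominant_frequency(sig, fs))  # -> 30.0
```

Run on a sliding window of geophone samples, the same peak-picking tracks the hammer frequency in real time as bit wear and weight on bit shift it.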
Procedia PDF Downloads 230
8273 A Comparison of Image Data Representations for Local Stereo Matching
Authors: André Smith, Amr Abdel-Dayem
Abstract:
The stereo matching problem, while having been studied for several decades, continues to be an active area of research. The goal of this research is to find correspondences between elements found in a set of stereoscopic images. With these pairings, it is possible to infer the distance of objects within a scene relative to the observer. Advancements in this field have led to experimentation with various techniques, from graph-cut energy minimization to artificial neural networks. At the basis of these techniques is a cost function, which is used to evaluate the likelihood of a particular match between points in each image. While the cost is, at its core, based on comparing the image pixel data, there is a general lack of consistency as to which image data representation to use. This paper presents an experimental analysis comparing the effectiveness of the more common image data representations. The goal is to determine how effective these data representations are at reducing the cost for the correct correspondence relative to other possible matches.
Keywords: colour data, local stereo matching, stereo correspondence, disparity map
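A common baseline cost for local stereo matching is the sum of absolute differences (SAD) over a support window, with a winner-takes-all disparity choice. A minimal sketch on a toy grayscale pair; the images and window size are hypothetical:

```python
def sad_cost(left, right, row, col, disparity, window=1):
    """Sum of absolute differences between a window in the left image and
    the same window shifted left by `disparity` in the right image
    (images are grayscale lists of lists)."""
    total = 0
    for dr in range(-window, window + 1):
        for dc in range(-window, window + 1):
            total += abs(left[row + dr][col + dc]
                         - right[row + dr][col + dc - disparity])
    return total

def best_disparity(left, right, row, col, max_disp, window=1):
    """Winner-takes-all: pick the disparity with the lowest SAD cost."""
    costs = [sad_cost(left, right, row, col, d, window)
             for d in range(max_disp + 1)]
    return costs.index(min(costs))

# Toy pair: the right image is the left image shifted 2 pixels left,
# so the true disparity is 2 wherever the feature is visible.
left = [[0, 0, 10, 80, 10, 0, 0, 0] for _ in range(5)]
right = [[10, 80, 10, 0, 0, 0, 0, 0] for _ in range(5)]
print(best_disparity(left, right, 2, 3, max_disp=2))  # -> 2
```

Swapping the intensity lists for another representation (colour channels, gradients, census bits) while keeping this same cost loop is exactly the comparison the paper sets up.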
Procedia PDF Downloads 370
8272 Iterative Segmentation and Application of Hausdorff Dilation Distance in Defect Detection
Authors: S. Shankar Bharathi
Abstract:
Inspection of surface defects on metallic components has always been challenging due to their specular property. Defects such as scratches, rust and pitting occur very commonly on metallic surfaces during the manufacturing process. These defects, if unchecked, can hamper the performance and reduce the lifetime of such components. Many conventional image processing algorithms for detecting surface defects involve segmentation techniques based on thresholding, edge detection, watershed segmentation and textural segmentation. They later employ other suitable algorithms based on morphology, region growing, shape analysis, or neural networks for classification purposes. In this paper, the work is focused only on detecting scratches. Global and other thresholding techniques were used to extract the defects, but they proved to be inaccurate in extracting the defects alone. However, this paper does not focus on a comparison of different segmentation techniques, but rather describes a novel approach towards segmentation combined with the Hausdorff dilation distance. The proposed algorithm is based on the distribution of the intensity levels, that is, whether a certain gray level is concentrated or evenly distributed. The algorithm is based on the extraction of such concentrated pixels. Defective images showed a higher level of concentration of some gray levels, whereas in non-defective images there seemed to be no concentration, and the levels were evenly distributed. This formed the basis for detecting the defects in the proposed algorithm. The Hausdorff dilation distance, based on mathematical morphology, was used to strengthen the segmentation of the defects.
Keywords: metallic surface, scratches, segmentation, Hausdorff dilation distance, machine vision
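The Hausdorff distance underlying the dilation-based formulation can be sketched in its equivalent point-set form: the max over one set of the min distance to the other. The scratch and reference pixel sets below are hypothetical, and the dilation-based computation in the paper is a morphological route to this same quantity.

```python
def directed_hausdorff(points_a, points_b):
    """Directed Hausdorff distance h(A, B): max over a in A of the min
    Euclidean distance to B. A large value means some part of A lies far
    from B (e.g. a segmented blob far from the reference defect mask)."""
    return max(min(((ax - bx) ** 2 + (ay - by) ** 2) ** 0.5
                   for bx, by in points_b)
               for ax, ay in points_a)

def hausdorff(points_a, points_b):
    """Symmetric Hausdorff distance H(A, B) = max(h(A, B), h(B, A))."""
    return max(directed_hausdorff(points_a, points_b),
               directed_hausdorff(points_b, points_a))

scratch = [(0, 0), (1, 1), (2, 2)]             # pixels flagged as a scratch
reference = [(0, 0), (1, 1), (2, 2), (10, 2)]  # reference mask with an extra blob
print(directed_hausdorff(scratch, reference))  # -> 0.0 (scratch fully covered)
print(hausdorff(scratch, reference))           # -> 8.0 ((10, 2) is 8 from (2, 2))
```

The asymmetry is the useful part: a small h(segmentation, reference) with a large h(reference, segmentation) says the segmentation missed part of the defect, which is exactly the signal used to strengthen it.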
Procedia PDF Downloads 427
8271 Software Evolution Based Activity Diagrams
Authors: Zine-Eddine Bouras, Abdelouaheb Talai
Abstract:
During the last two decades, the software evolution community has intensively tackled the software merging issue, whose main objective is to merge different versions of software in a consistent way in order to obtain a new version. Well-established approaches, mainly based on dependence analysis techniques, have been used to provide suitable solutions. These approaches concern the source code or software architectures. However, such solutions are expensive due to their complexity and size. In this paper, we overcome this problem by operating at a high level of abstraction. The objective of this paper is to investigate software merging at the level of UML activity diagrams, which is a new and interesting issue. Its purpose is to merge activity diagrams instead of source code. The proposed approach, based on dependence analysis techniques, is illustrated through an appropriate case study.
Keywords: activity diagram, activity diagram slicing, dependency analysis, software merging
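Dependence-analysis-based merging rests on slicing: collecting the nodes a chosen element transitively depends on. A minimal backward slice over a toy dependence graph; the graph is hypothetical, and real activity-diagram slicing distinguishes richer control and data dependences than this sketch does.

```python
from collections import deque

def backward_slice(deps, criterion):
    """Return the set of nodes on which `criterion` transitively depends,
    given a dependence graph mapping each activity-diagram node to the
    nodes it directly depends on.

    Merging can then proceed by taking the slices that matter in each
    version and checking their union for conflicts."""
    seen = {criterion}
    queue = deque([criterion])
    while queue:
        node = queue.popleft()
        for dep in deps.get(node, ()):
            if dep not in seen:
                seen.add(dep)
                queue.append(dep)
    return seen

# Toy activity diagram: report depends on validate; validate on read, config.
deps = {
    "report": ["validate"],
    "validate": ["read", "config"],
    "read": [],
    "config": [],
    "log": [],
}
print(sorted(backward_slice(deps, "report")))
# -> ['config', 'read', 'report', 'validate']  ("log" is sliced away)
```

Nodes outside every slice of interest (here "log") can be merged trivially, which is where the cost saving over whole-diagram comparison comes from.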
Procedia PDF Downloads 327
8270 Convolutional Neural Networks versus Radiomic Analysis for Classification of Breast Mammogram
Authors: Mehwish Asghar
Abstract:
Breast Cancer (BC) is a common type of cancer among women. Its screening is usually performed using different imaging modalities such as magnetic resonance imaging, mammography, X-ray, CT, etc. Among these modalities, the mammogram is considered a powerful tool for the diagnosis and screening of breast cancer. Sophisticated machine learning approaches have shown promising results in complementing human diagnosis. Generally, machine learning methods can be divided into two major classes: one is Radiomics analysis (RA), where image features are extracted manually; the other is convolutional neural networks (CNN), in which the computer learns to recognize image features on its own. This research aims to improve the rate of early detection, thus reducing the mortality caused by breast cancer, through the latest advancements in computer science in general and machine learning in particular. It also aims to ease the burden on doctors by improving and automating the process of breast cancer detection. This research involves a comparative analysis of different techniques and models for detecting and classifying breast cancer, with the main goal of providing a detailed view of the results and performance of the different techniques. The purpose of this paper is to explore the potential of a convolutional neural network (CNN) as a feature extractor and as a classifier. In addition, a radiomics module is included so that its results can be compared with those of the deep learning techniques.
Keywords: breast cancer (BC), machine learning (ML), convolutional neural network (CNN), radiomics, magnetic resonance imaging, artificial intelligence
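On the radiomics side, manually extracted features typically start with first-order intensity statistics of a region of interest. A minimal sketch over a hypothetical ROI (real radiomics pipelines add shape and texture families on top of these):

```python
import math

def first_order_features(pixels):
    """A few first-order radiomic features of a region of interest:
    mean, variance, skewness, and histogram entropy of the intensities."""
    n = len(pixels)
    mean = sum(pixels) / n
    var = sum((p - mean) ** 2 for p in pixels) / n
    std = math.sqrt(var)
    skew = (sum((p - mean) ** 3 for p in pixels) / n) / std ** 3 if std else 0.0
    counts = {}
    for p in pixels:
        counts[p] = counts.get(p, 0) + 1
    entropy = -sum((c / n) * math.log2(c / n) for c in counts.values())
    return {"mean": mean, "variance": var, "skewness": skew, "entropy": entropy}

# Hypothetical ROI: mostly uniform tissue plus one bright pixel.
roi = [10, 10, 12, 14, 14, 14, 50]
print(first_order_features(roi))
```

These hand-crafted values feed a conventional classifier, whereas the CNN branch learns its own features directly from the pixels; the paper's comparison is between exactly these two feature routes.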
Procedia PDF Downloads 225
8269 Non-Invasive Techniques for Management of Carious Primary Dentition Using Silver Diamine Fluoride and Moringa Extract as a Modification of the Hall Technique
Authors: Rasha F. Sharaf
Abstract:
The treatment of dental caries in young children is considered a great challenge for all dentists, especially with uncooperative children. Recently, non-invasive techniques have been highlighted, as they alleviate the need for local anesthesia and other painful procedures during the management of carious teeth and, at the same time, increase the success rate of the treatment. Silver Diamine Fluoride (SDF) is one of the most effective cariostatic materials: it arrests the progression of carious lesions and aids in remineralizing the demineralized tooth structure. Both fluoride and silver ions have been shown to have an antibacterial action and to aid in the precipitation of an insoluble layer that prevents further decay. At the same time, Moringa has proved to have an effective antibacterial action against different types of bacteria; therefore, it can be used in a non-invasive technique for the management of caries in children. One important approach for the control of caries is to deprive the cariogenic bacteria of nutrients, causing their starvation and death. This can be achieved by applying a stainless steel crown to primary molars with carious lesions that do not involve the pulp, a technique known as the Hall technique. The success rate of the Hall technique can be increased by arresting the carious lesion using either SDF or Moringa and gaining the benefit of their antibacterial action. Multiple clinical cases with one year of follow-up will be presented, comparing different treatment options and using various materials and techniques for the non-invasive and painless management of carious primary teeth.
Keywords: SDF, hall technique, carious primary teeth, moringa extract
Procedia PDF Downloads 96
8268 Study of Education Learning Techniques and Game Genres
Authors: Khadija Al Farei, Prakash Kumar, Vikas Rao Naidu
Abstract:
Games of different genres have been developed for different age groups for many decades. In many places, educational games play a vital role in creating an active classroom environment and better learning among students. Currently, educational games have assumed an important place in the lives of children and teenagers. The role of educational games is important for improving learning capability among students, especially of this generation, who live surrounded by electronic gadgets. Hence, it is now important to make sure that our educational system is updated with all such advancements in technology. Much research is already underway in this area of edutainment. This research paper reviews around ten different research papers to find the relation between education learning techniques and games. The result of this review provides guidelines for enhanced teaching and learning solutions in education. In-house developed educational games proved to be more effective compared to those readily available in the market.
Keywords: education, education game, educational technology, edutainment, game genres, gaming in education
Procedia PDF Downloads 415
8267 Level Set and Morphological Operation Techniques in Application of Dental Image Segmentation
Authors: Abdolvahab Ehsani Rad, Mohd Shafry Mohd Rahim, Alireza Norouzi
Abstract:
Medical image analysis is one of the important applications of computer image processing. It involves several processes, of which segmentation is one of the most challenging and important steps. In this paper, a segmentation method is proposed for dental radiograph images. A thresholding method is applied to simplify the images, and a morphological opening of the binary image is performed to eliminate unnecessary regions. Furthermore, horizontal and vertical integral projection techniques are used to extract each individual tooth from the radiograph images. Segmentation is then performed by applying the level set method to each extracted image. The experimental results, with 90% accuracy, demonstrate that the proposed method achieves high accuracy and promising results.
Keywords: integral projection, level set method, morphological operation, segmentation
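The integral projection step can be sketched directly: row and column sums of the binarized radiograph, with empty columns marking cut positions between teeth. The toy image below is illustrative, not radiograph data.

```python
def integral_projections(binary):
    """Row and column sums of a binary image; valleys in these profiles
    mark the gaps between teeth in a radiograph."""
    rows = [sum(r) for r in binary]
    cols = [sum(r[j] for r in binary) for j in range(len(binary[0]))]
    return rows, cols

def split_columns(col_profile):
    """Indices of empty columns, usable as cut positions between objects."""
    return [j for j, s in enumerate(col_profile) if s == 0]

# Two "teeth" (filled blocks) separated by an empty column band:
img = [[1, 1, 0, 0, 1, 1],
       [1, 1, 0, 0, 1, 1],
       [1, 1, 0, 0, 1, 1]]
rows, cols = integral_projections(img)
print(cols)                 # -> [3, 3, 0, 0, 3, 3]
print(split_columns(cols))  # -> [2, 3]
```

Each extracted tooth window is then handed to the level set stage, which refines the contour within that window.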
8266 Performance Evaluation of One and Two Dimensional Prime Codes for Optical Code Division Multiple Access Systems
Authors: Gurjit Kaur, Neena Gupta
Abstract:
In this paper, we have analyzed and compared the performance of various coding schemes. The basic 1D prime sequence codes are unique in only one dimension, i.e., time slots, whereas 2D codes are distinguished not only by their time slots but also by their wavelengths. In this research, we have evaluated and compared the performance of 1D and 2D coding techniques constructed using the prime sequence coding pattern for an Optical Code Division Multiple Access (OCDMA) system on a single platform. Analysis shows that 2D prime codes support fewer active users than 1D codes, but they have a larger code family and are the most secure of the codes compared. The performance of all these codes is analyzed on the basis of the number of active users supported at a Bit Error Rate (BER) of 10⁻⁹.
Keywords: CDMA, OCDMA, BER, OOC, PC, EPC, MPC, 2-D PC/PC, λc, λa
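The 1D prime-sequence construction the abstract refers to is compact enough to sketch directly. This is a generic textbook-style construction (not code from the paper): for a prime p, codeword i places one pulse per block of p slots, at slot (i·j mod p) in block j, giving p codewords of length p² and weight p; any two distinct codewords collide in exactly one slot at zero shift.

```python
def prime_codes(p):
    """Return the p prime-sequence codewords for prime p as 0/1 lists of length p*p."""
    codes = []
    for i in range(p):
        word = [0] * (p * p)
        for j in range(p):
            word[j * p + (i * j) % p] = 1  # one pulse per block of p slots
        codes.append(word)
    return codes

def correlation(a, b):
    """In-phase (zero-shift) correlation of two 0/1 codewords."""
    return sum(x * y for x, y in zip(a, b))

codes = prime_codes(5)
print(correlation(codes[1], codes[1]))  # autocorrelation peak = weight p = 5
print(correlation(codes[1], codes[3]))  # distinct codewords share exactly 1 slot
```

A 2D wavelength-time code would extend each pulse with a wavelength index, which is what enlarges the code family at the cost the abstract discusses.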
8265 Key Parameters for Controlling Swell of Expansive Soil-Hydraulic Cement Admixture
Authors: Aung Phyo Kyaw, Kuo Chieh Chao
Abstract:
Expansive soils behave in a more complicated manner than normal soils, even though the soil itself is not inherently complex. When evaluating foundation performance on expansive soil, it is important to consider soil expansion. The primary focus of this study is on hydraulic cement and expansive soil mixtures, and the research aims to identify key parameters for controlling the swell of the expansive soil-hydraulic cement mixture. Treatment depths can be determined using hydraulic cement ratios of 4%, 8%, 12%, and 15% for treating expansive soil. To understand the effect of hydraulic cement percentage on the swelling of the expansive soil-hydraulic cement admixture, performing the consolidation-swell test σ''ᶜˢ is crucial. This investigation primarily focuses on consolidation-swell tests σ''ᶜˢ, although the heave index Cₕ is also needed to determine total heave. The heave index can be measured using the percent swell at the specific inundation stress in the consolidation-swell test together with the swelling pressure from the constant-volume test. Obtaining the relationship between swelling pressure and σ''ᶜⱽ determined from the constant-volume test is useful in predicting heave from a single oedometer test. The relationship between σ''ᶜˢ and σ''ᶜⱽ is based on experimental results of expansive soil behavior and facilitates heave prediction for each soil. In this method, the soil property "m" is used as a parameter, and common soil property tests include compaction, particle size distribution, and the Atterberg limits. The Electricity Generating Authority of Thailand (EGAT) provided the soil sample for this study, and all laboratory testing was performed according to American Society for Testing and Materials (ASTM) standards.
Keywords: expansive soil, swelling pressure, total heave, treatment depth
8264 Performance of the Aptima® HIV-1 Quant Dx Assay on the Panther System
Authors: Siobhan O’Shea, Sangeetha Vijaysri Nair, Hee Cheol Kim, Charles Thomas Nugent, Cheuk Yan William Tong, Sam Douthwaite, Andrew Worlock
Abstract:
The Aptima® HIV-1 Quant Dx Assay is a fully automated assay on the Panther system, based on Transcription-Mediated Amplification and real-time detection technologies. The assay is intended for monitoring HIV-1 viral load in plasma specimens and for the detection of HIV-1 in plasma and serum specimens. Nine hundred and seventy-nine specimens selected at random from routine testing at St Thomas' Hospital, London, were anonymised and used to compare the performance of the Aptima HIV-1 Quant Dx assay and the Roche COBAS® AmpliPrep/COBAS® TaqMan® HIV-1 Test, v2.0. Two hundred and thirty-four specimens gave quantitative HIV-1 viral load results in both assays. The quantitative results reported by the Aptima assay were comparable to those reported by the Roche COBAS AmpliPrep/COBAS TaqMan HIV-1 Test, v2.0, with a linear regression slope of 1.04 and an intercept of -0.097. The Aptima assay detected HIV-1 in more samples than the Roche assay. This was not due to a lack of specificity of the Aptima assay, because this assay gave 99.83% specificity when testing plasma specimens from 600 HIV-1 negative individuals. To understand the reason for this higher detection rate, a side-by-side comparison of low-level panels made from the HIV-1 3rd international standard (NIBSC 10/152) and clinical samples of various subtypes was performed in both assays. The Aptima assay was more sensitive than the Roche assay. The good sensitivity, specificity, and agreement with other commercial assays make the HIV-1 Quant Dx Assay appropriate for both viral load monitoring and detection of HIV-1 infections.
Keywords: HIV viral load, Aptima, Roche, Panther system
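The slope/intercept agreement figures quoted above come from a linear regression of paired viral-load results. A minimal sketch with made-up paired log10 values (the real study used 234 paired specimens; the paper reports slope 1.04, intercept -0.097) looks like this:

```python
def ols(x, y):
    """Ordinary least squares fit y = slope*x + intercept; returns (slope, intercept)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    slope = sxy / sxx
    return slope, my - slope * mx

# Hypothetical paired log10 copies/mL: comparator assay (x) vs test assay (y).
x = [2.0, 3.0, 4.0, 5.0, 6.0]
y = [2.1, 3.05, 4.0, 4.95, 6.1]
slope, intercept = ols(x, y)
print(round(slope, 3), round(intercept, 3))  # 0.99 0.08
```

A slope near 1 and an intercept near 0 on the log10 scale indicate quantitative agreement between the two assays.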
8263 Systematic and Meta-Analysis of Navigation in Oral and Maxillofacial Trauma and Impact of Machine Learning and AI in Management
Authors: Shohreh Ghasemi
Abstract:
Introduction: Managing oral and maxillofacial trauma is a multifaceted challenge, as it can have life-threatening consequences and significant functional and aesthetic impact. Navigation techniques have been introduced to improve surgical precision to meet this challenge. A machine learning algorithm was also developed to support clinical decision-making in treating oral and maxillofacial trauma. Given these advances, this systematic meta-analysis aims to assess the efficacy of navigational techniques in treating oral and maxillofacial trauma and explore the impact of machine learning on their management. Methods: A detailed and comprehensive analysis of studies published between January 2010 and September 2021 was conducted through a systematic meta-analysis. This included performing a thorough search of the Web of Science, Embase, and PubMed databases to identify studies evaluating the efficacy of navigational techniques and the impact of machine learning in managing oral and maxillofacial trauma. Studies that did not meet the established inclusion criteria were excluded. In addition, the overall quality of the included studies was evaluated using the Cochrane risk-of-bias tool and the Newcastle-Ottawa scale. Results: A total of 12 studies, including 869 patients with oral and maxillofacial trauma, met the inclusion criteria. Analysis of these studies revealed that navigation techniques effectively improve surgical accuracy and minimize the risk of complications. Additionally, machine learning algorithms have proven effective in predicting treatment outcomes and identifying patients at high risk of complications. Conclusion: The introduction of navigational technology has great potential to improve surgical precision in oral and maxillofacial trauma treatment. Furthermore, developing machine learning algorithms offers opportunities to improve clinical decision-making and patient outcomes.
Still, further studies are necessary to corroborate these results and establish the optimal use of these technologies in managing oral and maxillofacial trauma.
Keywords: trauma, machine learning, navigation, maxillofacial, management
8262 Big Data in Telecom Industry: Effective Predictive Techniques on Call Detail Records
Authors: Sara ElElimy, Samir Moustafa
Abstract:
Mobile network operators are starting to face many challenges in the digital era, especially with high demands from customers. Mobile network operators are considered a source of big data, and traditional techniques are not effective in the new era of big data, the Internet of Things (IoT), and 5G; as a result, handling different big datasets effectively becomes a vital task for operators, given the continuous growth of data and the move from Long Term Evolution (LTE) to 5G. There is thus an urgent need for effective big data analytics to predict future demands, traffic, and network performance to fulfil the requirements of the fifth generation of mobile network technology. In this paper, we introduce data science techniques using machine learning and deep learning algorithms: the autoregressive integrated moving average (ARIMA), Bayesian-based curve fitting, and a recurrent neural network (RNN) are employed in a data-driven application for mobile network operators. The main framework for these models includes parameter identification for each model, estimation, prediction, and a final data-driven application of the predictions to business and network performance. These models are applied to the Telecom Italia Big Data Challenge call detail records (CDRs) datasets. The performance of these models, assessed using well-known evaluation criteria, shows that ARIMA (the machine learning-based model) is more accurate as a predictive model on such a dataset than the RNN (the deep learning model).
Keywords: big data analytics, machine learning, CDRs, 5G
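The identification-estimation-prediction loop described above can be illustrated with the simplest member of the ARIMA family, an AR(1) model fitted by least squares. The series below is synthetic and the whole sketch is illustrative, not the authors' pipeline:

```python
def fit_ar1(series):
    """Estimate y_t = c + phi * y_{t-1} by ordinary least squares."""
    x, y = series[:-1], series[1:]
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    phi = (sum((a - mx) * (b - my) for a, b in zip(x, y))
           / sum((a - mx) ** 2 for a in x))
    return my - phi * mx, phi

def forecast(series, steps, c, phi):
    """Iterate the fitted recursion forward from the last observation."""
    preds, last = [], series[-1]
    for _ in range(steps):
        last = c + phi * last
        preds.append(last)
    return preds

# Synthetic "call volume" series following y_t = 10 + 0.5 * y_{t-1}.
series = [5.0]
for _ in range(12):
    series.append(10 + 0.5 * series[-1])

c, phi = fit_ar1(series)
print(round(c, 2), round(phi, 2))  # recovers c ≈ 10, phi ≈ 0.5
print(round(forecast(series, 1, c, phi)[0], 1))
```

A production CDR pipeline would add differencing and moving-average terms (the "I" and "MA" of ARIMA) and select orders from the data, as the paper's framework describes.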
8261 Cold Spray High Entropy Alloy Coating Surface Microstructural Characterization and Mechanical Testing
Authors: Raffaella Sesana, Nazanin Sheibanian, Luca Corsaro, Sedat Özbilen, Rocco Lupoi, Francesco Artusio
Abstract:
High Entropy Alloy (HEA) coatings of Al0.1-0.5CoCrCuFeNi and MnCoCrCuFeNi on Mg substrates were prepared from mechanically alloyed HEA powder feedstocks at three different Cold Spray (CS) process gas (N2) temperatures (650, 750, and 850°C). Mechanically alloyed and cold-sprayed HEA coatings were characterized by macro photography, OM, SEM+EDS study, micro-hardness testing, and roughness and porosity measurements. As a result of mechanical alloying (MA), harder particles are deformed and fractured. The particles in the Cu-rich region were coarser and more globular than those in the A1 phase, which is relatively soft and ductile. In addition to the A1 particles, there were some separate Cu-rich regions. Due to the brittle nature of the powder and its acicular shape, the Mn-HEA powder exhibited a different trend, with smaller particle sizes. It is observed that MA results in a loose structure characterized by many gaps, cracks, signs of plastic deformation, and small particles attached to the particle surface. Considering the experimental results obtained, it is not possible to conclude that the chemical composition of the high entropy alloy influences the roughness of the coating. It was observed that the deposited volume increases with temperature only in the case of the Al0.1 and Mn-based HEAs, while for the rest of the Al-based HEAs there are no noticeable changes. There is a direct correlation between micro-hardness and the chemical composition of a coating: the micro-hardness of a coating increases as the percentage of aluminum in the sample increases. Compared to the substrate, the coating has a much higher hardness, and the hardness measured at the interface is intermediate.
Keywords: characterisation, cold spraying, HEA coatings, SEM+EDS
8260 Artificial Intelligence in Melanoma Prognosis: A Narrative Review
Authors: Shohreh Ghasemi
Abstract:
Introduction: Melanoma is a complex disease with various clinical and histopathological features that impact prognosis and treatment decisions. Traditional methods of melanoma prognosis involve manual examination and interpretation of clinical and histopathological data by dermatologists and pathologists. However, the subjective nature of these assessments can lead to inter-observer variability and suboptimal prognostic accuracy. AI, with its ability to analyze vast amounts of data and identify patterns, has emerged as a promising tool for improving melanoma prognosis. Methods: A comprehensive literature search was conducted to identify studies that employed AI techniques for melanoma prognosis. The search included databases such as PubMed and Google Scholar, using keywords such as "artificial intelligence," "melanoma," and "prognosis." Studies published between 2010 and 2022 were considered. The selected articles were critically reviewed, and relevant information was extracted. Results: The review identified various AI methodologies utilized in melanoma prognosis, including machine learning algorithms, deep learning techniques, and computer vision. These techniques have been applied to diverse data sources, such as clinical images, dermoscopy images, histopathological slides, and genetic data. Studies have demonstrated the potential of AI in accurately predicting melanoma prognosis, including survival outcomes, recurrence risk, and response to therapy. AI-based prognostic models have shown comparable or even superior performance compared to traditional methods.
Keywords: artificial intelligence, melanoma, accuracy, prognosis prediction, image analysis, personalized medicine
8259 Discrete Breeding Swarm for Cost Minimization of Parallel Job Shop Scheduling Problem
Authors: Tarek Aboueldahab, Hanan Farag
Abstract:
The Parallel Job Shop Scheduling Problem (JSP) is a multi-objective, multi-constraint NP-hard optimization problem. Traditional Artificial Intelligence techniques have been widely used; however, they can become trapped in a local minimum without reaching the optimum solution, so we propose a hybrid Artificial Intelligence (AI) model with Discrete Breeding Swarm (DBS) added to the traditional techniques to avoid this trapping. The model is applied to cost minimization of the Car Sequencing and Operator Allocation (CSOA) problem. Practical experiments show that our model outperforms other techniques in cost minimization.
Keywords: parallel job shop scheduling problem, artificial intelligence, discrete breeding swarm, car sequencing and operator allocation, cost minimization
8258 Parallel Self Organizing Neural Network Based Estimation of Archie’s Parameters and Water Saturation in Sandstone Reservoir
Authors: G. M. Hamada, A. A. Al-Gathe, A. M. Al-Khudafi
Abstract:
Determination of water saturation in sandstone is vital for determining the initial oil or gas in place in reservoir rocks. Water saturation determination using electrical measurements relies mainly on Archie’s formula; consequently, the accuracy of Archie’s formula parameters rigorously affects water saturation values. Determination of Archie’s parameters a, m, and n proceeds by three techniques: the conventional technique, Core Archie-Parameter Estimation (CAPE), and the 3-D technique. This work introduces a hybrid system of parallel self-organizing neural networks (PSONN) targeting accepted values of Archie’s parameters and, consequently, reliable water saturation values. The work first covers the Archie’s parameter determination techniques (conventional, CAPE, and 3-D) and the calculation of water saturation using each. Using the same data, the hybrid parallel self-organizing neural network (PSONN) algorithm is then used to estimate Archie’s parameters and predict water saturation. Results have shown that the estimated Archie’s parameters m, a, and n are highly acceptable under statistical analysis, indicating that the PSONN model has a lower statistical error and a higher correlation coefficient. This study was conducted using a high number of measurement points for 144 core plugs from a sandstone reservoir. The PSONN algorithm can provide reliable water saturation values, and it can supplement or even replace the conventional techniques for determining Archie’s parameters and thereby calculating water saturation profiles.
Keywords: water saturation, Archie’s parameters, artificial intelligence, PSONN, sandstone reservoir
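The conventional baseline the abstract builds on, Archie's formula, can be written down directly. The parameter values below are generic textbook-style assumptions for a clean sandstone (a = 1, m = n = 2), not data from the study:

```python
def archie_sw(a, m, n, rw, rt, phi):
    """Water saturation from Archie's equation:
    Sw = (a * Rw / (phi**m * Rt)) ** (1/n),
    where Rw is formation water resistivity, Rt is true formation
    resistivity, and phi is porosity (fraction)."""
    return (a * rw / (phi ** m * rt)) ** (1.0 / n)

# Illustrative log readings: Rw = 0.05 ohm-m, Rt = 20 ohm-m, phi = 20%.
sw = archie_sw(a=1.0, m=2.0, n=2.0, rw=0.05, rt=20.0, phi=0.20)
print(round(sw, 3))  # 0.25, i.e. 25% water saturation
```

The sensitivity of Sw to a, m, and n in this expression is exactly why the paper focuses on estimating those three parameters accurately.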
8257 Study of Bima Tembe and Its Relation to Rimpu as a Cultural Women Clothes in Bima
Authors: Morinta Rosandini
Abstract:
Bima Tembe is an excellent example of a cultural artifact, widely regarded as: (1) manufactured by traditional techniques, (2) rich in varied forms and great philosophical motifs, and (3) having valued functions related to women's status in society. This research examined the elements of Bima Tembe and their relations, as well as one usage of the tembe, named Rimpu. The elements include: (1) the traditional techniques of making Bima Tembe, (2) its variety of forms, and (3) the philosophical motifs of Bima Tembe. Rimpu is a traditional women's garment in Bima that uses Bima Tembe as its main part. This research found that Bima Tembe is made by a weaving technique using a traditional loom and has two types, Tembe Istana and Tembe Rakyat, with various motifs for each type. The wearing of Rimpu is a symbol of obedience to God, and the type of Rimpu indicates a woman's status in society.
Keywords: bima, tembe, rimpu, clothes
8256 A Time since of Injection Model for Hepatitis C Amongst People Who Inject Drugs
Authors: Nader Al-Rashidi, David Greenhalgh
Abstract:
Mathematical modelling techniques are now being used by health organizations worldwide to help understand the likely impact that intervention strategies, treatment options, and combinations of these have on the prevalence and incidence of the hepatitis C virus (HCV) in the people who inject drugs (PWID) population. In this poster, we develop a deterministic, compartmental mathematical model to approximate the spread of HCV in a PWID population that has been divided into two groups by time since onset of injection. The model assumes that, after an injection, needles adopt the more infectious of their previous state and that of the PWID who last injected with them. Using analytical techniques, we find that the model behaviour is determined by the basic reproduction number R₀, where R₀ = 1 is a critical threshold separating two different outcomes. The disease-free equilibrium is globally stable if R₀ ≤ 1 and unstable if R₀ > 1. Additionally, simulations confirm that, with realistic parameter values, the model tends to the endemic equilibrium, giving a positive HCV prevalence.
Keywords: hepatitis C, people who inject drugs, HCV, PWID
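The R₀ threshold behaviour described above can be illustrated with a one-group SIS-style prevalence equation, a deliberate simplification of the paper's two-group needle-sharing model; all parameter values are illustrative:

```python
def simulate(beta, gamma, i0, dt=0.01, steps=200_000):
    """Euler-integrate dI/dt = beta*(1-I)*I - gamma*I and return the final
    prevalence I. Here R0 = beta/gamma: for R0 > 1 prevalence tends to the
    endemic level 1 - 1/R0; for R0 <= 1 it decays to zero."""
    i = i0
    for _ in range(steps):
        i += dt * (beta * (1 - i) * i - gamma * i)
    return i

print(round(simulate(0.6, 0.3, 0.01), 3))  # R0 = 2 -> endemic prevalence 0.5
print(round(simulate(0.2, 0.3, 0.01), 3))  # R0 < 1 -> prevalence 0.0
```

The same threshold structure (disease-free equilibrium stable iff R₀ ≤ 1) is what the paper establishes analytically for its two-group model.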
8255 An Analysis of the Impact of Government Budget Deficits on Economic Performance. A Zimbabwean Perspective
Authors: Tafadzwa Shumba, Rose C. Nyatondo, Regret Sunge
Abstract:
This research analyses the impact of budget deficits on the economic performance of Zimbabwe. The study employs the autoregressive distributed lag (ARDL) bounds testing approach to co-integration and long-run estimation, using time series data from 1980-2018. The Augmented Dickey-Fuller (ADF) test and the Granger approach were used to test for stationarity and causality among the factors. Co-integration test results affirm a long-run association between the GDP growth rate and the explanatory factors. Causality test results show unidirectional causality from the budget deficit to GDP growth and bi-directional causality between debt and the budget deficit. This study also found unidirectional causality from debt to the GDP growth rate. ARDL estimates indicate a significantly positive long-run and significantly negative short-run impact of the budget deficit on GDP. This suggests that budget deficits have a short-run growth-retarding effect and a long-run growth-inducing effect. The long-run results follow the Keynesian theory, which posits that fiscal deficits result in an increase in GDP growth; the short-run outcomes follow the neoclassical theory. In light of these findings, the government is recommended to minimize the financing of recurrent expenditure through budget deficits. To achieve sustainable growth and development, the government needs to run an absorbable budget deficit focused on capital projects such as the development of human capital and infrastructure.
Keywords: ARDL, budget deficit, economic performance, long run
8254 Stochastic Control of Decentralized Singularly Perturbed Systems
Authors: Walid S. Alfuhaid, Saud A. Alghamdi, John M. Watkins, M. Edwin Sawan
Abstract:
Designing a controller for stochastic, decentralized, interconnected large-scale systems usually involves a high degree of complexity and computational effort. Noise, observability and controllability of all system states, connectivity, and channel bandwidth are further constraints on design procedures for distributed large-scale systems. The quasi-steady-state model investigated in this paper is a reduced-order model of the original system obtained using singular perturbation techniques. The paper develops an optimal control synthesis for an observer-based feedback controller by standard stochastic control theory techniques, using the Linear Quadratic Gaussian (LQG) approach and Kalman filter design, with reduced complexity and computational requirements. A numerical example is given at the end to demonstrate the efficiency of the proposed method.
Keywords: decentralized, optimal control, output, singular perturbation
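The LQG synthesis mentioned above rests on solving a discrete Riccati equation for the feedback gain. A scalar sketch (illustrative values, and omitting the Kalman-filter half of the design) is:

```python
def dlqr_scalar(a, b, q, r, iters=500):
    """LQR gain for x_{k+1} = a*x_k + b*u_k with cost sum(q*x^2 + r*u^2),
    found by iterating the discrete Riccati recursion to a fixed point:
        k = b*p*a / (r + b*b*p)
        p = q + a*a*p - a*p*b*k
    The optimal control is u = -k*x."""
    p = q
    for _ in range(iters):
        k = (b * p * a) / (r + b * b * p)
        p = q + a * a * p - a * p * b * k
    return k

# Open-loop unstable scalar plant (a > 1), illustrative weights q = r = 1.
k = dlqr_scalar(a=1.1, b=1.0, q=1.0, r=1.0)
# The closed loop a - b*k must be stable (|a - b*k| < 1).
print(abs(1.1 - k) < 1)  # True
```

In a full LQG design this gain would act on state estimates from a Kalman filter rather than on the true states, which is the observer-based structure the abstract describes.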